
[Beta]{class="badge informative"}

Create a source connection and dataflow for Customer.io using the Flow Service API

NOTE
The Customer.io source is in beta. See the sources overview for more information on using beta-labeled sources.

The following tutorial walks you through the steps to create a Customer.io source connection and dataflow to bring event data to ÃÛ¶¹ÊÓƵ Experience Platform using the Flow Service API.

Getting started getting-started

This guide requires a working understanding of the following components of Experience Platform:

  • Sources: Experience Platform allows data to be ingested from various sources while providing you with the ability to structure, label, and enhance incoming data using Platform services.
  • Sandboxes: Experience Platform provides virtual sandboxes which partition a single Platform instance into separate virtual environments to help develop and evolve digital experience applications.

Connect Customer.io to Platform using the Flow Service API connect-platform-to-flow-api

The following outlines the steps you need to take in order to create a source connection and a dataflow to bring your Customer.io events data to Experience Platform.

Create a source connection source-connection

Create a source connection by making a POST request to the Flow Service API, while providing the connection spec ID of your source, details like name and description, and the format of your data.

API format

POST /sourceConnections

Request

The following request creates a source connection for Customer.io:

curl -X POST \
  'https://platform.adobe.io/data/foundation/flowservice/sourceConnections' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -d '{
      "name": "Streaming Source Connection for Customer.io",
      "providerId": "521eee4d-8cbe-4906-bb48-fb6bd4450033",
      "description": "Streaming Source Connection for customer.io",
      "connectionSpec": {
          "id": "96479064-7b8a-4d69-b9ed-21c5683837ea",
          "version": "1.0"
      },
      "data": {
          "format": "json"
      }
    }'
  • name: The name of your source connection. Ensure that the name of your source connection is descriptive, as you can use it to look up information on your source connection.
  • description: An optional value that you can include to provide more information on your source connection.
  • connectionSpec.id: The connection specification ID that corresponds to your source.
  • data.format: The format of the Customer.io data that you want to ingest. Currently, the only supported data format is json.

Response

A successful response returns the unique identifier (id) of the newly created source connection. This ID is required in a later step to create a dataflow.

{
    "id": "133bb51f-f310-4b4a-b8b2-731aef1e223c",
    "etag": "\"af00a717-0000-0200-0000-63ef2cbd0000\""
}

Create a target XDM schema target-schema

In order for the source data to be used in Platform, a target schema must be created to structure the source data according to your needs. The target schema is then used to create a Platform dataset in which the source data is contained.

A target XDM schema can be created by performing a POST request to the Schema Registry API.

For detailed steps on how to create a target XDM schema, see the tutorial on creating a schema using the API.
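
For reference, the following is a minimal sketch of such a request. It assumes the target schema is based on the XDM ExperienceEvent class; the title and description are placeholders, and you would typically add your own field groups afterward.

curl -X POST \
  'https://platform.adobe.io/data/foundation/schemaregistry/tenant/schemas' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -d '{
      "title": "Customer.io Events Schema",
      "description": "Target XDM schema for Customer.io event data.",
      "allOf": [
          {
              "$ref": "https://ns.adobe.com/xdm/context/experienceevent"
          }
      ]
  }'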

Create a target dataset target-dataset

A target dataset can be created by performing a POST request to the Catalog Service API, providing the ID of the target schema within the payload.

For detailed steps on how to create a target dataset, see the tutorial on creating a dataset using the API.
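
As a rough illustration, the request below sketches a dataset creation call to the Catalog Service API. The dataset name is hypothetical, and the schemaRef.id reuses the example schema ID shown later in this tutorial; substitute the $id of your own target schema.

curl -X POST \
  'https://platform.adobe.io/data/foundation/catalog/dataSets' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -d '{
      "name": "Customer.io Events Dataset",
      "schemaRef": {
          "id": "https://ns.adobe.com/{TENANT_ID}/schemas/945546112b746524bfd9f1264b26c2b7d8e7f5b7fadb953a",
          "contentType": "application/vnd.adobe.xed+json;version=1"
      }
  }'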

Create a target connection target-connection

A target connection represents the connection to the destination where the ingested data is to be stored. To create a target connection, you must provide the fixed connection specification ID that corresponds to the data lake. This ID is: c604ff05-7f1a-43c0-8e18-33bf874cb11c.

You now have the unique identifiers of a target schema and a target dataset, as well as the connection spec ID of the data lake. Using these identifiers, you can create a target connection using the Flow Service API to specify the dataset that will contain the inbound source data.

API format

POST /targetConnections

Request

The following request creates a target connection for Customer.io:

curl -X POST \
  'https://platform.adobe.io/data/foundation/flowservice/targetConnections' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -d '{
      "name": "Streaming Target Connection for Customer.io",
      "description": "Streaming Target Connection for Customer.io",
      "connectionSpec": {
          "id": "c604ff05-7f1a-43c0-8e18-33bf874cb11c",
          "version": "1.0"
      },
      "data": {
          "format": "json",
          "schema": {
              "id": "https://ns.adobe.com/extconndev/schemas/945546112b746524bfd9f1264b26c2b7d8e7f5b7fadb953a",
              "version": "application/vnd.adobe.xed-full+json;version=1"
          }
      },
      "params": {
          "dataSetId": "63ec807d3f5ce91bd2d06c65"
      }
  }'
  • name: The name of your target connection. Ensure that the name of your target connection is descriptive, as you can use it to look up information on your target connection.
  • description: An optional value that you can include to provide more information on your target connection.
  • connectionSpec.id: The connection specification ID that corresponds to the data lake. This fixed ID is: c604ff05-7f1a-43c0-8e18-33bf874cb11c.
  • data.format: The format of the Customer.io data that you want to ingest.
  • params.dataSetId: The ID of the target dataset retrieved in a previous step.

Response

A successful response returns the new target connection’s unique identifier (id). This ID is required in later steps.

{
    "id": "da8b75ad-f6ee-4991-95df-291e62936e98",
    "etag": "\"70003dff-0000-0200-0000-63ef4a090000\""
}

Create a mapping mapping

In order for the source data to be ingested into a target dataset, it must first be mapped to the target schema that the target dataset adheres to. This is achieved by performing a POST request to the Data Prep API with data mappings defined within the request payload.

API format

POST /conversion/mappingSets

Request

curl -X POST \
  'https://platform.adobe.io/data/foundation/conversion/mappingSets' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -d '{
      "outputSchema": {
          "schemaRef": {
              "id": "https://ns.adobe.com/{TENANT_ID}/schemas/945546112b746524bfd9f1264b26c2b7d8e7f5b7fadb953a",
              "contentType": "application/vnd.adobe.xed-full+json;version=1"
          }
      },
    "mappings": [
        {
            "destinationXdmPath": "_extconndev.cio_id",
            "sourceAttribute": "data.identifiers.cio_id",
            "identity": false,
            "version": 0
        },
        {
            "destinationXdmPath": "_extconndev.email",
            "sourceAttribute": "data.identifiers.email",
            "identity": false,
            "version": 0
        },
        {
            "destinationXdmPath": "_extconndev.event_id0",
            "sourceAttribute": "event_id",
            "identity": false,
            "version": 0
        },
        {
            "destinationXdmPath": "_extconndev.metricx",
            "sourceAttribute": "metric",
            "identity": false,
            "version": 0
        },
        {
            "destinationXdmPath": "_extconndev.object_type1",
            "sourceAttribute": "object_type",
            "identity": false,
            "version": 0
        },
        {
            "destinationXdmPath": "_extconndev.timestampx",
            "sourceAttribute": "timestamp",
            "identity": false,
            "version": 0
        }
    ]
  }'
  • outputSchema.schemaRef.id: The ID of the target XDM schema generated in an earlier step.
  • mappings.sourceAttribute: The source attribute that needs to be mapped to a destination XDM path.
  • mappings.destinationXdmPath: The destination XDM path that the source attribute is being mapped to.

Response

A successful response returns details of the newly created mapping including its unique identifier (id). This value is required in a later step to create a dataflow.

{
    "id": "59c0e53a2dc84f7791ecc1b3d6e51d5e",
    "version": 0,
    "createdDate": 1676627988129,
    "modifiedDate": 1676627988129,
    "createdBy": "{CREATED_BY}",
    "modifiedBy": "{MODIFIED_BY}"
}

Create a flow flow

The last step towards bringing data from Customer.io to Platform is to create a dataflow. By now, you have the following required values prepared:

  • Source connection ID
  • Target connection ID
  • Mapping ID

A dataflow is responsible for scheduling and collecting data from a source. You can create a dataflow by performing a POST request while providing the previously mentioned values within the payload.

API format

POST /flows

Request

curl -X POST \
  'https://platform.adobe.io/data/foundation/flowservice/flows' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -d '{
      "name": "Streaming Dataflow for Customer.io",
      "description": "Streaming Dataflow for Customer.io",
      "flowSpec": {
          "id": "e77fde5a-22a8-11ed-861d-0242ac120002",
          "version": "1.0"
      },
      "sourceConnectionIds": [
          "133bb51f-f310-4b4a-b8b2-731aef1e223c"
      ],
      "targetConnectionIds": [
          "da8b75ad-f6ee-4991-95df-291e62936e98"
      ],
      "transformations": [
      {
        "name": "Mapping",
        "params": {
          "mappingId": "59c0e53a2dc84f7791ecc1b3d6e51d5e",
          "mappingVersion": 0
        }
      }
    ]
  }'
  • name: The name of your dataflow. Ensure that the name of your dataflow is descriptive, as you can use it to look up information on your dataflow.
  • description: An optional value that you can include to provide more information on your dataflow.
  • flowSpec.id: The flow specification ID required to create a dataflow. This fixed ID is: e77fde5a-22a8-11ed-861d-0242ac120002.
  • flowSpec.version: The corresponding version of the flow specification ID. This value defaults to 1.0.
  • sourceConnectionIds: The source connection ID generated in an earlier step.
  • targetConnectionIds: The target connection ID generated in an earlier step.
  • transformations: This property contains the various transformations that need to be applied to your data. This property is required when bringing non-XDM-compliant data to Platform.
  • transformations.name: The name assigned to the transformation.
  • transformations.params.mappingId: The mapping ID generated in an earlier step.
  • transformations.params.mappingVersion: The corresponding version of the mapping ID. This value defaults to 0.

Response

A successful response returns the ID (id) of the newly created dataflow. You can use this ID to monitor, update, or delete your dataflow.

{
    "id": "4982698b-e6b3-48c2-8dcf-040e20121fd2",
    "etag": "\"4c012103-0000-0200-0000-63ef57db0000\""
}

Get your streaming endpoint URL get-streaming-endpoint-url

With your dataflow created, you can now retrieve your streaming endpoint URL. You will use this endpoint URL to subscribe your source to a webhook, allowing your source to communicate with Experience Platform.

To retrieve your streaming endpoint URL, make a GET request to the /flows endpoint and provide the ID of your dataflow.

API format

GET /flows/{FLOW_ID}

Request

curl -X GET \
  'https://platform.adobe.io/data/foundation/flowservice/flows/4982698b-e6b3-48c2-8dcf-040e20121fd2' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}'

Response

A successful response returns information on your dataflow, including your endpoint URL, marked as inletUrl. Refer to the Setup Webhook page to obtain the required value.

{
    "items": [
        {
            "id": "4982698b-e6b3-48c2-8dcf-040e20121fd2",
            "createdAt": 1676629979503,
            "updatedAt": 1676629985390,
            "createdBy": "acme@ÃÛ¶¹ÊÓƵID",
            "updatedBy": "acme@ÃÛ¶¹ÊÓƵID",
            "createdClient": "{CREATED_CLIENT}",
            "updatedClient": "{UPDATED_CLIENT}",
            "sandboxId": "{SANDBOX_ID}",
            "sandboxName": "{SANDBOX_NAME}",
            "imsOrgId": "{ORG_ID}",
            "name": "Streaming Dataflow for Customer.io",
            "description": "Streaming Dataflow for Customer.io",
            "flowSpec": {
                "id": "e77fde5a-22a8-11ed-861d-0242ac120002",
                "version": "1.0"
            },
            "state": "enabled",
            "version": "\"4c01c003-0000-0200-0000-63ef57e10000\"",
            "etag": "\"4c01c003-0000-0200-0000-63ef57e10000\"",
            "sourceConnectionIds": [
                "133bb51f-f310-4b4a-b8b2-731aef1e223c"
            ],
            "targetConnectionIds": [
                "da8b75ad-f6ee-4991-95df-291e62936e98"
            ],
            "inheritedAttributes": {
                "properties": {
                    "isSourceFlow": true
                },
                "sourceConnections": [
                    {
                        "id": "133bb51f-f310-4b4a-b8b2-731aef1e223c",
                        "connectionSpec": {
                            "id": "96479064-7b8a-4d69-b9ed-21c5683837ea",
                            "version": "1.0"
                        }
                    }
                ],
                "targetConnections": [
                    {
                        "id": "da8b75ad-f6ee-4991-95df-291e62936e98",
                        "connectionSpec": {
                            "id": "c604ff05-7f1a-43c0-8e18-33bf874cb11c",
                            "version": "1.0"
                        }
                    }
                ]
            },
            "options": {
                "inletUrl": "https://dcs.adobedc.net/collection/e75dcb5247eb65e7385df30270192e80b145566f52ed74d570505bd2e82463f3"
            },
            "transformations": [
                {
                    "name": "Mapping",
                    "params": {
                        "mappingId": "59c0e53a2dc84f7791ecc1b3d6e51d5e",
                        "mappingVersion": 0
                    }
                }
            ],
            "runs": "/runs?property=flowId==4982698b-e6b3-48c2-8dcf-040e20121fd2",
            "providerRefId": "c4726e6f-64b4-4b3b-97e3-f128ace0cc74",
            "lastOperation": {
                "started": 0,
                "updated": 0,
                "operation": "enable"
            }
        }
    ]
}
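
Once you register the inletUrl as the webhook URL in Customer.io, Customer.io posts reporting events to it. For illustration only, the sketch below sends a hypothetical test event directly to the inlet; the field names mirror the source attributes used in the mapping step, and all values are made up.

curl -X POST \
  'https://dcs.adobedc.net/collection/e75dcb5247eb65e7385df30270192e80b145566f52ed74d570505bd2e82463f3' \
  -H 'Content-Type: application/json' \
  -d '{
      "event_id": "01E4C4CT6YDC7Y5M7FE1GWWPQJ",
      "object_type": "email",
      "metric": "opened",
      "timestamp": 1613063089,
      "data": {
          "identifiers": {
              "cio_id": "d4129817-1b62-4b0a-b53f-0a95f9e0b859",
              "email": "test@example.com"
          }
      }
  }'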

Appendix appendix

The following section provides information on the steps you can take to monitor, update, and delete your dataflow.

Monitor your dataflow monitor-dataflow

Once your dataflow has been created, you can monitor the data that is being ingested through it to see information on flow runs, completion status, and errors. For complete API examples, read the guide on monitoring your sources dataflows using the API.
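
For example, a minimal sketch of a flow runs query for the dataflow created above (the filter follows the runs link returned in the flow response):

curl -X GET \
  'https://platform.adobe.io/data/foundation/flowservice/runs?property=flowId==4982698b-e6b3-48c2-8dcf-040e20121fd2' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}'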

Update your dataflow update-dataflow

Update the details of your dataflow, such as its name and description, as well as its run schedule and associated mapping sets, by making a PATCH request to the /flows endpoint of the Flow Service API, while providing the ID of your dataflow. When making a PATCH request, you must provide your dataflow’s unique etag in the If-Match header. For complete API examples, read the guide on updating sources dataflows using the API.
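
The following is a minimal sketch of such a request, renaming the dataflow created earlier. The If-Match value reuses the etag returned when the dataflow was created, and the JSON Patch operation shown is just one example of a supported update.

curl -X PATCH \
  'https://platform.adobe.io/data/foundation/flowservice/flows/4982698b-e6b3-48c2-8dcf-040e20121fd2' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -H 'If-Match: "4c012103-0000-0200-0000-63ef57db0000"' \
  -d '[
      {
          "op": "replace",
          "path": "/name",
          "value": "Renamed Streaming Dataflow for Customer.io"
      }
  ]'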

Update your account update-account

Update the name, description, and credentials of your source account by performing a PATCH request to the Flow Service API while providing your base connection ID as a query parameter. When making a PATCH request, you must provide your source account’s unique etag in the If-Match header. For complete API examples, read the guide on updating your source account using the API.
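
The following is a rough sketch of such a request; {BASE_CONNECTION_ID} and {ETAG} are placeholders for your own account's values, and the base connection ID is shown supplied in the request path.

curl -X PATCH \
  'https://platform.adobe.io/data/foundation/flowservice/connections/{BASE_CONNECTION_ID}' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}' \
  -H 'Content-Type: application/json' \
  -H 'If-Match: {ETAG}' \
  -d '[
      {
          "op": "replace",
          "path": "/name",
          "value": "Updated Customer.io account"
      }
  ]'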

Delete your dataflow delete-dataflow

Delete your dataflow by performing a DELETE request to the Flow Service API while providing the ID of the dataflow you want to delete as part of the query parameter. For complete API examples, read the guide on deleting your dataflows using the API.
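
For example, to delete the dataflow created in this tutorial (the flow ID below is the one returned earlier; substitute your own):

curl -X DELETE \
  'https://platform.adobe.io/data/foundation/flowservice/flows/4982698b-e6b3-48c2-8dcf-040e20121fd2' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}'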

Delete your account delete-account

Delete your account by performing a DELETE request to the Flow Service API while providing the base connection ID of the account you want to delete. For complete API examples, read the guide on deleting your source account using the API.
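
As a rough sketch, with {BASE_CONNECTION_ID} as a placeholder for the account you want to remove:

curl -X DELETE \
  'https://platform.adobe.io/data/foundation/flowservice/connections/{BASE_CONNECTION_ID}' \
  -H 'Authorization: Bearer {ACCESS_TOKEN}' \
  -H 'x-api-key: {API_KEY}' \
  -H 'x-gw-ims-org-id: {ORG_ID}' \
  -H 'x-sandbox-name: {SANDBOX_NAME}'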
