
Create a data feed

When creating a data feed, you provide ÃÛ¶¹ÊÓƵ with:

  • Information about the destination where you want raw data files to be sent

  • The data you want to include in each file

NOTE
Before you create a data feed, it’s important to have a basic understanding of data feeds and to ensure that you meet all prerequisites. For more information, see Data feeds overview.

Create and configure a data feed

  1. Log in to the ÃÛ¶¹ÊÓƵ Experience Cloud using your ÃÛ¶¹ÊÓƵ ID credentials.

  2. Select the 9-square icon in the upper-right, then select Analytics.

  3. In the top navigation bar, go to Admin > Data feeds.

  4. Select Add.


    A page displays with three main categories: Feed information, Destination, and Data column definitions.

  5. In the Feed Information section, complete the following fields:

    • Name: The name of the data feed. The name must be unique within the selected report suite and can be up to 255 characters long.

    • Report suite: The report suite that the data feed is based on. If multiple data feeds are created for the same report suite, they must have different column definitions. Only source report suites support data feeds; virtual report suites are not supported.

    • Email when complete: The email address to notify when a feed finishes processing. The email address must be properly formatted.

    • Feed interval: Select Daily for backfill or historical data; daily feeds contain a full day's worth of data, from midnight to midnight in the report suite's time zone. Select Hourly for continuing data (Daily is also available for continuing feeds); hourly feeds contain a single hour's worth of data.

    • Delay processing: Wait a given amount of time before processing a data feed file. A delay can give offline mobile devices an opportunity to come online and send data, or accommodate your organization's server-side processes for managing previously processed files. In most cases, no delay is needed. A feed can be delayed by up to 120 minutes.

    • Start & end dates: The start date indicates when you want the data feed to begin. To begin processing historical data immediately, set this date to any past date when data was being collected. The start and end dates are based on the report suite's time zone.

    • Continuous feed: This checkbox removes the end date, allowing a feed to run indefinitely. When a feed finishes processing historical data, it waits for data collection for the current hour or day to finish. Once the current hour or day concludes, processing begins after the specified delay.
  6. In the Destination section, in the Type drop-down menu, select the destination where you want the data to be sent.

    NOTE
    Consider the following when configuring a report destination:

    • We recommend using a cloud account for your report destination. Legacy FTP and SFTP accounts are available, but are not recommended.

    • Any cloud accounts that you previously configured anywhere in ÃÛ¶¹ÊÓƵ Analytics are available to use for data feeds.

    • Cloud accounts are associated with your ÃÛ¶¹ÊÓƵ Analytics user account. Other users cannot use or view cloud accounts that you configure.

    • You can edit any locations that you create from the Locations manager in Components > Locations.


    Use any of the following destination types when creating a data feed. Configuration instructions for each destination type follow. (Additional legacy destinations are also available, but are not recommended; see Legacy destinations below.)

    Amazon S3

    You can send feeds directly to Amazon S3 buckets. This destination type requires only your Amazon S3 account and the location (bucket).

    ÃÛ¶¹ÊÓƵ Analytics uses cross-account authentication to upload files from ÃÛ¶¹ÊÓƵ Analytics to the specified location in your Amazon S3 instance.

    To configure an Amazon S3 bucket as the destination for a data feed:

    1. Begin creating a data feed as described in Create and configure a data feed.

    2. In the Destination section, in the Type drop-down menu, select Amazon S3.


    3. Select Select location.

      The Amazon S3 Export Locations page is displayed.

    4. (Conditional) If an Amazon S3 account (and a location on that account) has already been configured in ÃÛ¶¹ÊÓƵ Analytics, you can use it as your data feed destination:

      NOTE
      Accounts are available to you only if you configured them or if they were shared with an organization you are a part of.
      1. Select the account from the Select account drop-down menu.

        Any cloud accounts that were previously configured in ÃÛ¶¹ÊÓƵ Analytics are available to use.

      2. Select the location from the Select location drop-down menu.

      3. Select Save > Save.

      The destination is now configured to send data to the Amazon S3 location that you specified.

    5. (Conditional) If you have not previously added an Amazon S3 account:

      1. Select Add account, then specify the following information:

        • Account name: A name for the account. This can be any name you choose.
        • Account description: A description for the account.
        • Role ARN: A Role ARN (Amazon Resource Name) that ÃÛ¶¹ÊÓƵ can use to gain access to the Amazon S3 account. To create one, define an IAM permission policy for the source account, attach the policy to a user, and then create a role for the destination account. For specific information, see the AWS documentation on cross-account access. (A hedged setup sketch appears at the end of this section.)
        • User ARN: The User ARN (Amazon Resource Name) is provided by ÃÛ¶¹ÊÓƵ. You must attach this user to the policy you created.
      2. Select Add location, then specify the following information:

        • Name: A name for the location.
        • Description: A description for the location.
        • Bucket: The bucket within your Amazon S3 account where you want ÃÛ¶¹ÊÓƵ Analytics data to be sent. Ensure that the User ARN provided by ÃÛ¶¹ÊÓƵ has the s3:PutObject permission on this bucket; this permission allows the User ARN to upload initial files and overwrite files on subsequent uploads. Bucket names must meet Amazon's naming rules. For example, they must be 3 to 63 characters long, can consist only of lowercase letters, numbers, dots (.), and hyphens (-), and must begin and end with a letter or number.
        • Prefix: The folder within the bucket where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example: folder_name/
      3. Select Create > Save.

        The destination is now configured to send data to the Amazon S3 location that you specified.

      4. (Conditional) If you need to manage the destination (account and location) that you just created, it is available in the Locations manager.
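
    The Role ARN setup described above can also be scripted. The following is a minimal, hypothetical sketch using the boto3 Python package; the ÃÛ¶¹ÊÓƵ-provided User ARN, role names, and bucket are placeholders, so adapt them to your account. It creates a role that the ÃÛ¶¹ÊÓƵ User ARN can assume and grants that role s3:PutObject on your feed bucket.

    python
    # Hypothetical sketch: ARNs, role names, and the bucket are placeholders.
    import json

    import boto3

    ADOBE_USER_ARN = "arn:aws:iam::111111111111:user/adobe-analytics"  # provided by ÃÛ¶¹ÊÓƵ
    BUCKET = "my-datafeed-bucket"  # your destination bucket

    iam = boto3.client("iam")

    # Trust policy: allow the ÃÛ¶¹ÊÓƵ-provided user to assume this role.
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": ADOBE_USER_ARN},
            "Action": "sts:AssumeRole",
        }],
    }
    role = iam.create_role(
        RoleName="adobe-datafeed-role",
        AssumeRolePolicyDocument=json.dumps(trust_policy),
    )

    # Permission policy: the role may upload (and overwrite) objects in the bucket.
    put_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }],
    }
    iam.put_role_policy(
        RoleName="adobe-datafeed-role",
        PolicyName="adobe-datafeed-put",
        PolicyDocument=json.dumps(put_policy),
    )

    # Paste this Role ARN into the data feed destination configuration.
    print(role["Role"]["Arn"])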

    Azure RBAC

    You can send feeds directly to an Azure container by using RBAC authentication. This destination type requires an Application ID, Tenant ID, and Secret.

    To configure an Azure RBAC account as the destination for a data feed:

    1. If you haven’t already, create an Azure application that ÃÛ¶¹ÊÓƵ Analytics can use for authentication, then grant access permissions in access control (IAM).

      For information, refer to the Microsoft Azure documentation.

    2. In the ÃÛ¶¹ÊÓƵ Analytics admin console, in the Destination section, in the Type drop-down menu, select Azure RBAC.


    3. Select Select location.

      The Azure RBAC Export Locations page is displayed.

    4. (Conditional) If an Azure RBAC account (and a location on that account) has already been configured in ÃÛ¶¹ÊÓƵ Analytics, you can use it as your data feed destination:

      NOTE
      Accounts are available to you only if you configured them or if they were shared with an organization you are a part of.
      1. Select the account from the Select account drop-down menu.

        Any cloud accounts that were previously configured in ÃÛ¶¹ÊÓƵ Analytics are available to use.

      2. Select the location from the Select location drop-down menu.

      3. Select Save > Save.

        The destination is now configured to send data to the Azure RBAC location that you specified.

    5. (Conditional) If you have not previously added an Azure RBAC account:

      1. Select Add account, then specify the following information:

        • Account name: A name for the Azure RBAC account. This name displays in the Select account drop-down field and can be any name you choose.
        • Account description: A description for the Azure RBAC account. This description displays in the Select account drop-down field.
        • Application ID: Copy this ID from the Azure application that you created. In Microsoft Azure, this information is located on the Overview tab within your application. For more information, see the Microsoft Azure documentation.
        • Tenant ID: Copy this ID from the Azure application that you created. In Microsoft Azure, this information is located on the Overview tab within your application. For more information, see the Microsoft Azure documentation.
        • Secret: Copy the secret from the Azure application that you created. In Microsoft Azure, this information is located on the Certificates & secrets tab within your application. For more information, see the Microsoft Azure documentation.
      2. Select Add location, then specify the following information:

        • Name: A name for the location. This name displays in the Select location drop-down field and can be any name you choose.
        • Description: A description for the location. This description displays in the Select location drop-down field.
        • Account: The Azure storage account.
        • Container: The container within the account you specified where you want ÃÛ¶¹ÊÓƵ Analytics data to be sent. Ensure that you grant upload permissions to the Azure application that you created earlier.
        • Prefix: The folder within the container where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example: folder_name/ Make sure that the Application ID you specified when configuring the Azure RBAC account has been granted the Storage Blob Data Contributor role on the container; for more information, see the Microsoft Azure documentation. (A hedged verification sketch appears at the end of this section.)

      3. Select Create > Save.

        The destination is now configured to send data to the Azure RBAC location that you specified.

      4. (Conditional) If you need to manage the destination (account and location) that you just created, it is available in the Locations manager.
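
    To verify the setup end to end, the following is a minimal, hypothetical sketch using the azure-identity and azure-storage-blob Python packages; all IDs, account names, and the container are placeholders. It authenticates as the Azure application and writes a test blob, which succeeds only if the application has the Storage Blob Data Contributor role on the container.

    python
    # Hypothetical sketch: all IDs and names are placeholders.
    from azure.identity import ClientSecretCredential
    from azure.storage.blob import BlobServiceClient

    # Authenticate as the Azure application you registered for ÃÛ¶¹ÊÓƵ Analytics.
    credential = ClientSecretCredential(
        tenant_id="<tenant-id>",        # from the app's Overview tab
        client_id="<application-id>",   # from the app's Overview tab
        client_secret="<secret>",       # from Certificates & secrets
    )

    service = BlobServiceClient(
        account_url="https://<storage-account>.blob.core.windows.net",
        credential=credential,
    )

    # Requires the Storage Blob Data Contributor role on the container.
    blob = service.get_blob_client(container="<container>", blob="folder_name/connectivity-check.txt")
    blob.upload_blob(b"ok", overwrite=True)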

    Azure SAS

    You can send feeds directly to an Azure container by using SAS authentication. This destination type requires an Application ID, Tenant ID, Key vault URI, Key vault secret name, and Secret.

    To configure Azure SAS as the destination for a data feed:

    1. If you haven’t already, create an Azure application that ÃÛ¶¹ÊÓƵ Analytics can use for authentication.

      For information, refer to the Microsoft Azure documentation.

    2. In the ÃÛ¶¹ÊÓƵ Analytics admin console, in the Destination section, in the Type drop-down menu, select Azure SAS.


    3. Select Select location.

      The Azure SAS Export Locations page is displayed.

    4. (Conditional) If an Azure SAS account (and a location on that account) has already been configured in ÃÛ¶¹ÊÓƵ Analytics, you can use it as your data feed destination:

      NOTE
      Accounts are available to you only if you configured them or if they were shared with an organization you are a part of.
      1. Select the account from the Select account drop-down menu.

        Any cloud accounts that were previously configured in ÃÛ¶¹ÊÓƵ Analytics are available to use.

      2. Select the location from the Select location drop-down menu.

      3. Select Save > Save.

        The destination is now configured to send data to the Azure SAS location that you specified.

    5. (Conditional) If you have not previously added an Azure SAS account:

      1. Select Add account, then specify the following information:

        • Account name: A name for the Azure SAS account. This name displays in the Select account drop-down field and can be any name you choose.
        • Account description: A description for the Azure SAS account. This description displays in the Select account drop-down field.
        • Application ID: Copy this ID from the Azure application that you created. In Microsoft Azure, this information is located on the Overview tab within your application. For more information, see the Microsoft Azure documentation.
        • Tenant ID: Copy this ID from the Azure application that you created. In Microsoft Azure, this information is located on the Overview tab within your application. For more information, see the Microsoft Azure documentation.
        • Key vault URI: The path to the SAS URI in Azure Key Vault. To configure Azure SAS, you need to store a SAS URI as a secret in Azure Key Vault; for information, see the Azure Key Vault documentation. After the key vault URI is created:

          • Add an access policy on the key vault to grant permission to the Azure application that you created, or grant an access role directly by adding a role assignment for the Application ID on the key vault. For information, see the Microsoft Azure documentation.

          • Make sure the Application ID has been granted the Key Vault Secrets User built-in role so that it can read the secret from the key vault. (A hedged verification sketch appears at the end of this section.)

        • Key vault secret name: The secret name you created when adding the secret to Azure Key Vault. In Microsoft Azure, this information is located in the key vault you created, on the Key Vault settings pages.
        • Secret: Copy the secret from the Azure application that you created. In Microsoft Azure, this information is located on the Certificates & secrets tab within your application. For more information, see the Microsoft Azure documentation.
      2. Select Add location, then specify the following information:

        • Name: A name for the location. This name displays in the Select location drop-down field and can be any name you choose.
        • Description: A description for the location. This description displays in the Select location drop-down field.
        • Container: The container within the account you specified where you want ÃÛ¶¹ÊÓƵ Analytics data to be sent.
        • Prefix: The folder within the container where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example: folder_name/ Make sure that the SAS URI stored under the Key vault secret name you specified when configuring the Azure SAS account has the Write permission, which allows the SAS URI to create files in your Azure container. If you want the SAS URI to also overwrite files, make sure that it has the Delete permission. For more information, see the Azure Blob Storage documentation.

      3. Select Create > Save.

        The destination is now configured to send data to the Azure SAS location that you specified.

      4. (Conditional) If you need to manage the destination (account and location) that you just created, it is available in the Locations manager.
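
    The chain of credentials here (application secret, then key vault, then SAS URI) can be verified from code. The following is a minimal, hypothetical sketch using the azure-identity, azure-keyvault-secrets, and azure-storage-blob Python packages; all IDs, names, and URLs are placeholders. It reads the SAS URI from the key vault and uses it to write a test blob.

    python
    # Hypothetical sketch: all IDs, names, and URLs are placeholders.
    from azure.identity import ClientSecretCredential
    from azure.keyvault.secrets import SecretClient
    from azure.storage.blob import ContainerClient

    # Authenticate as the Azure application (requires the Key Vault Secrets User
    # role, or an equivalent access policy, on the vault).
    credential = ClientSecretCredential(
        tenant_id="<tenant-id>",
        client_id="<application-id>",
        client_secret="<secret>",
    )

    # Read the SAS URI stored as a secret in the key vault.
    vault = SecretClient(vault_url="https://<vault-name>.vault.azure.net", credential=credential)
    sas_uri = vault.get_secret("<key-vault-secret-name>").value

    # Write a test blob; requires the Write permission on the SAS URI
    # (and Delete if existing files may be overwritten).
    container = ContainerClient.from_container_url(sas_uri)
    container.upload_blob(name="folder_name/connectivity-check.txt", data=b"ok", overwrite=True)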

    Google Cloud Platform

    You can send feeds directly to Google Cloud Platform (GCP) buckets. This destination type requires only your GCP account name and the location (bucket) name.

    ÃÛ¶¹ÊÓƵ Analytics uses cross-account authentication to upload files from ÃÛ¶¹ÊÓƵ Analytics to the specified location in your GCP instance.

    To configure a GCP bucket as the destination for a data feed:

    1. In the ÃÛ¶¹ÊÓƵ Analytics admin console, in the Destination section, in the Type drop-down menu, select Google Cloud Platform.


    2. Select Select location.

      The GCP Export Locations page is displayed.

    3. (Conditional) If a Google Cloud Platform account (and a location on that account) has already been configured in ÃÛ¶¹ÊÓƵ Analytics, you can use it as your data feed destination:

      NOTE
      Accounts are available to you only if you configured them or if they were shared with an organization you are a part of.
      1. Select the account from the Select account drop-down menu.

        Any cloud accounts that were previously configured in ÃÛ¶¹ÊÓƵ Analytics are available to use.

      2. Select the location from the Select location drop-down menu.

      3. Select Save > Save.

        The destination is now configured to send data to the Google Cloud Platform location that you specified.

    4. (Conditional) If you have not previously added a GCP account:

      1. Select Add account, then specify the following information:

        • Account name: A name for the account. This can be any name you choose.
        • Account description: A description for the account.
        • Project ID: Your Google Cloud project ID. See the Google Cloud documentation.
      2. Select Add location, then specify the following information:

        • Principal: The Principal is provided by ÃÛ¶¹ÊÓƵ. You must grant this principal permission to receive feeds.
        • Name: A name for the location.
        • Description: A description for the location.
        • Bucket: The bucket within your GCP account where you want ÃÛ¶¹ÊÓƵ Analytics data to be sent. Ensure that you have granted either of the following roles to the Principal provided by ÃÛ¶¹ÊÓƵ. (For information about granting roles, see the Google Cloud documentation; a hedged sketch appears at the end of this section.)

          • roles/storage.objectCreator: Use this role if you want to limit the Principal to only creating files in your GCP account. Important: If you use this role with scheduled reporting, you must use a unique file name for each new scheduled export. Otherwise, report generation fails because the Principal does not have access to overwrite existing files.

          • (Recommended) roles/storage.objectUser: Use this role if you want the Principal to have access to view, list, update, and delete files in your GCP account. This role allows the Principal to overwrite existing files on subsequent uploads, without the need to auto-generate unique file names for each new scheduled export.

          If your organization uses domain restricted sharing to allow only the Google Cloud Platform organizations in your allow list, you need the following ÃÛ¶¹ÊÓƵ-owned Google Cloud Platform organization ID:

          • DISPLAY_NAME: adobe.com
          • ID: 178012854243
          • DIRECTORY_CUSTOMER_ID: C02jo8puj

        • Prefix: The folder within the bucket where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example: folder_name/
      3. Select Create > Save.

        The destination is now configured to send data to the GCP location that you specified.

      4. (Conditional) If you need to manage the destination (account and location) that you just created, it is available in the Locations manager.
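
    Granting the bucket role can also be scripted. The following is a minimal, hypothetical sketch using the google-cloud-storage Python package; the principal, project ID, and bucket name are placeholders. It appends an IAM binding that grants the recommended roles/storage.objectUser role to the ÃÛ¶¹ÊÓƵ-provided Principal.

    python
    # Hypothetical sketch: principal, project, and bucket names are placeholders.
    from google.cloud import storage

    ADOBE_PRINCIPAL = "serviceAccount:adobe-feed@example.iam.gserviceaccount.com"  # provided by ÃÛ¶¹ÊÓƵ

    client = storage.Client(project="<project-id>")
    bucket = client.bucket("<bucket-name>")

    # Fetch the bucket's IAM policy and add a binding for the ÃÛ¶¹ÊÓƵ principal.
    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        "role": "roles/storage.objectUser",  # or roles/storage.objectCreator to limit to creates
        "members": {ADOBE_PRINCIPAL},
    })
    bucket.set_iam_policy(policy)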

  7. In the Data Column Definitions section, select the latest All ÃÛ¶¹ÊÓƵ Columns template in the drop-down menu, then complete the following fields:

    • Remove escaped characters: When collecting data, some characters (such as newlines) can cause issues. Check this box if you want these characters removed from feed files.

    • Compression format: The type of compression used. Gzip outputs files in .tar.gz format; Zip outputs files in .zip format.

    • Packaging type: Select Multiple files for most data feeds. This option paginates your data into uncompressed 2 GB chunks. (If Multiple files is selected and the uncompressed data for the reporting window is less than 2 GB, one file is sent.) Selecting Single file outputs the hit_data.tsv file as a single, potentially massive file.

    • Manifest: Determines whether ÃÛ¶¹ÊÓƵ delivers a manifest file to the destination when no data is collected for a feed interval. If you select Manifest File, you receive a manifest file similar to the following when no data is collected (a parsing sketch appears after this procedure):

      text
      Datafeed-Manifest-Version: 1.0
      Lookup-Files: 0
      Data-Files: 0
      Total-Records: 0

    • Column templates: When creating many data feeds, ÃÛ¶¹ÊÓƵ recommends creating a column template. Selecting a column template automatically includes the columns specified in the template. ÃÛ¶¹ÊÓƵ also provides several templates by default.

    • Available columns: All available data columns in ÃÛ¶¹ÊÓƵ Analytics. Click Add all to include all columns in a data feed.

    • Included columns: The columns to include in a data feed. Click Remove all to remove all columns from a data feed.

    • Download CSV: Downloads a CSV file containing all included columns.
  8. Select Save in the top-right.

    Historical data processing begins immediately. When data finishes processing for a day, the file is sent to the destination that you configured.

    For information about how to access the data feed and to get a better understanding of its contents, see Data feed contents - overview.
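
    If you automate downstream processing, the manifest format shown in step 7 is straightforward to parse. The following is a minimal sketch in Python, assuming the key: value layout shown above and a hypothetical local file name:

    python
    # Minimal sketch: parse a delivered manifest file (key: value lines) into a dict.
    def parse_manifest(path: str) -> dict[str, str]:
        fields: dict[str, str] = {}
        with open(path, encoding="utf-8") as f:
            for line in f:
                if ":" in line:
                    key, value = line.split(":", 1)
                    fields[key.strip()] = value.strip()
        return fields

    manifest = parse_manifest("feed_manifest.txt")  # hypothetical file name
    if manifest.get("Data-Files") == "0":
        print("No data was collected for this feed interval.")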

Legacy destinations

IMPORTANT
The destinations described in this section are legacy, and are not recommended. Instead, use one of the following destinations when creating a data feed: Amazon S3, Google Cloud Platform, Azure RBAC, or Azure SAS. See Create and configure a data feed for detailed information about each of these recommended destinations.

The following sections provide configuration information for each of the legacy destinations:

FTP

Data feed data can be delivered to an ÃÛ¶¹ÊÓƵ-hosted or customer-hosted FTP location. This destination requires an FTP host, username, and password. Use the Path field to place feed files in a folder. Folders must already exist; feeds throw an error if the specified path does not exist.

Use the following information when completing the available fields:

  • Host: Enter the desired FTP destination URL. For example, ftp://ftp.omniture.com.
  • Path: Can be left blank.
  • Username: Enter the username to log in to the FTP site.
  • Password and confirm password: Enter the password to log in to the FTP site.
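
Because feeds error out when the path does not exist, it is worth checking the folder before enabling the feed. The following is a minimal, hypothetical sketch using Python's standard ftplib; the host, credentials, and path are placeholders.

python
# Hypothetical sketch: host, credentials, and path are placeholders.
from ftplib import FTP, error_perm

ftp = FTP("ftp.example.com")
ftp.login(user="<username>", passwd="<password>")
try:
    ftp.cwd("/feeds/my_feed")  # raises error_perm if the folder does not exist
    print("Destination folder exists.")
except error_perm:
    print("Destination folder is missing; create it before enabling the feed.")
finally:
    ftp.quit()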

SFTP

SFTP support for data feeds is available. This destination requires an SFTP host, a username, and a valid RSA or DSA public key installed on the destination site. You can download the appropriate public key when creating the feed.

S3

You can send feeds directly to Amazon S3 buckets. This destination type requires a Bucket name, an Access Key ID, and a Secret Key. See the Amazon S3 documentation for more information.

The user you provide for uploading data feeds must have permission to upload and overwrite files in the bucket (at minimum, s3:PutObject).
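
As a quick smoke test that the key pair can upload, the following is a minimal, hypothetical boto3 (Python) sketch; the bucket name, keys, and region are placeholders.

python
# Hypothetical sketch: bucket name, key pair, and region are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="<access-key-id>",
    aws_secret_access_key="<secret-key>",
    region_name="us-east-1",  # any supported region from the list below
)
s3.put_object(Bucket="<bucket-name>", Key="feed-smoke-test.txt", Body=b"ok")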

The following 16 standard AWS regions are supported (using the appropriate signature algorithm where necessary):

  • us-east-2
  • us-east-1
  • us-west-1
  • us-west-2
  • ap-south-1
  • ap-northeast-2
  • ap-southeast-1
  • ap-southeast-2
  • ap-northeast-1
  • ca-central-1
  • eu-central-1
  • eu-west-1
  • eu-west-2
  • eu-west-3
  • eu-north-1
  • sa-east-1
NOTE
The cn-north-1 region is not supported.

Azure Blob

Data feeds support Azure Blob destinations. This destination requires a container, account, and key. Azure automatically encrypts the data at rest. When you download the data, it is decrypted automatically. See the Microsoft Azure documentation for more information.

NOTE
You must implement your own process to manage disk space on the feed destination. ÃÛ¶¹ÊÓƵ does not delete any data from the server.