Configure cloud import and export locations
- System administrators can restrict users from creating locations, as described in Configure whether users can create locations. If you can’t create locations as described in this section, contact your system administrator.
- A location can be edited only by the user who created it or by a system administrator.
After you configure a cloud account, you can configure a location on that account. A single location can be used for any one of the following purposes (a single location cannot be associated with multiple purposes):
- Exporting files using Data Feeds
- Exporting reports using Data Warehouse
- Importing schemas using Classification sets
You must configure ÃÛ¶¹ÊÓƵ Analytics with the necessary information to access your cloud account. This process consists of adding and configuring the account (such as Amazon S3 Role ARN, Google Cloud Platform, and so forth) as described in Configure cloud import and export accounts, and then adding and configuring the location within that account (as described in this article).
For information about how to view and delete existing locations, see Locations manager.
Begin creating or editing a location
1. In ÃÛ¶¹ÊÓƵ Analytics, select Components > Locations.

2. On the Locations page, select the Locations tab.

3. (Conditional) If you are a system administrator, you can enable the View locations for all users option to view locations created by all users in your organization.

4. To add a new location, select Add location. (If you haven’t already added an account, add one as described in Configure cloud import and export accounts.) The Add location dialog displays.

   Or, to edit an existing location, select the 3-dot menu next to the location name, then select Edit. The Location details dialog displays.

5. Specify the following information:

   - Name: The name of the location.
   - Description: Provide a short description of the location to help differentiate it from other locations of the same account type.
   - Use with: Select whether you want to use this location with Data Feeds, Data Warehouse, or Classification sets. Consider the following when making a selection:
     - A single location cannot be used for multiple purposes. For example, a location that is used for Data Feeds cannot also be used for Data Warehouse or Classification sets.
     - To avoid file conflicts within a location, don’t change the value of the Use with field after the location has been used.
     - If you are creating a location for an Email account, select Data Warehouse in this field. Email locations are not supported with Data Feeds or Classification sets.
   - Make location available to all users in your organization: Enable this option to allow other users in your organization to use the location. Consider the following when sharing locations:
     - Locations that you share cannot be unshared.
     - Shared locations can be edited only by the owner of the location.
     - Locations can be shared only if the account that the location is associated with is also shared.
   - Location account: Select the location account where you want to create this location. For information about how to create an account, see Configure cloud import and export accounts.

6. To complete the form for configuring the location, continue with the section below that corresponds to the account type that you selected in the Location account field. (Additional legacy account types are also available, but are not recommended.)
Amazon S3 Role ARN
To configure an Amazon S3 Role ARN location, specify the following information:
1. Begin creating or editing a location, as described above.

2. Specify the following information:

   | Field | Function |
   |---|---|
   | Bucket | The bucket within your Amazon S3 account where you want ÃÛ¶¹ÊÓƵ Analytics data to be sent. Ensure that the User ARN that was provided by ÃÛ¶¹ÊÓƵ has the S3:PutObject permission in order to upload files to this bucket. Bucket names must meet specific naming rules. For example, they must be between 3 and 63 characters long, can consist only of lowercase letters, numbers, dots (.), and hyphens (-), and must begin and end with a letter or number. |
   | Prefix | The folder within the bucket where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example, folder_name/ |

3. Select Save.
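The bucket naming rules above can be checked before you save the location. The following is a minimal sketch (not part of ÃÛ¶¹ÊÓƵ Analytics; the function name is illustrative) that validates a bucket name against the rules stated in the table:

```python
import re

# Illustrative helper: checks the S3 bucket naming rules cited above --
# 3 to 63 characters, lowercase letters, numbers, dots (.) and hyphens (-)
# only, beginning and ending with a letter or number.
BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def is_valid_bucket_name(name: str) -> bool:
    return bool(BUCKET_RE.match(name))

print(is_valid_bucket_name("my-analytics-feed"))  # True
print(is_valid_bucket_name("My_Bucket"))          # False: uppercase and underscore
print(is_valid_bucket_name("ab"))                 # False: shorter than 3 characters
```

Note that AWS applies a few additional restrictions (for example, names formatted like IP addresses are rejected); the regex covers only the rules listed in this article.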
You can now import or export data to or from the account and location that you configured. To export data, use Data Feeds or Data Warehouse. To import data, use Classification sets.
Imported data is not deleted from the cloud destination after it is imported.
NOTE: If you previously used FTP to import classifications to ÃÛ¶¹ÊÓƵ Analytics, you needed to upload a FIN file. This FIN file is not needed when importing from cloud accounts.
Google Cloud Platform
To configure a Google Cloud Platform location, specify the following information:
1. Begin creating or editing a location, as described above.

2. Specify the following information:

   | Field | Function |
   |---|---|
   | Bucket | The bucket within your GCP account where you want ÃÛ¶¹ÊÓƵ Analytics data to be sent. Ensure that you have granted permission to the Principal provided by ÃÛ¶¹ÊÓƵ to upload files to this bucket. |
   | Prefix | The folder within the bucket where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example, folder_name/ |

3. Select Save.
You can now import or export data to or from the account and location that you configured. To export data, use Data Feeds or Data Warehouse. To import data, use Classification sets.
Imported data is not deleted from the cloud destination after it is imported.
NOTE: If you previously used FTP to import classifications to ÃÛ¶¹ÊÓƵ Analytics, you needed to upload a FIN file. This FIN file is not needed when importing from cloud accounts.
Azure SAS
To configure an Azure SAS location, specify the following information:
1. Begin creating or editing a location, as described above.

2. Specify the following information:

   | Field | Function |
   |---|---|
   | Container | The container within the account you specified where you want ÃÛ¶¹ÊÓƵ Analytics data to be sent. |
   | Prefix | The folder within the container where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example, folder_name/ |

3. Select Save.
You can now import or export data to or from the account and location that you configured. To export data, use Data Feeds or Data Warehouse. To import data, use Classification sets.
Imported data is not deleted from the cloud destination after it is imported.
NOTE: If you previously used FTP to import classifications to ÃÛ¶¹ÊÓƵ Analytics, you needed to upload a FIN file. This FIN file is not needed when importing from cloud accounts.
Azure RBAC
To configure an Azure RBAC location, specify the following information:
1. Begin creating or editing a location, as described above.

2. Specify the following information:

   | Field | Function |
   |---|---|
   | Account | The Azure storage account. |
   | Container | The container within the account you specified where you want ÃÛ¶¹ÊÓƵ Analytics data to be sent. Ensure that you grant permissions to upload files to the Azure application that you created earlier. |
   | Prefix | The folder within the container where you want to put the data. Specify a folder name, then add a forward slash after the name to create the folder. For example, folder_name/ |

3. Select Save.
You can now import or export data to or from the account and location that you configured. To export data, use Data Feeds or Data Warehouse. To import data, use Classification sets.
Imported data is not deleted from the cloud destination after it is imported.
NOTE: If you previously used FTP to import classifications to ÃÛ¶¹ÊÓƵ Analytics, you needed to upload a FIN file. This FIN file is not needed when importing from cloud accounts.
Email
To configure an email location, specify the following information:
1. Begin creating or editing a location, as described above.

2. Specify the following information:

   | Field | Function |
   |---|---|
   | Subject | The subject of the email message. |
   | Notes | The content of the email message. |

3. Select Save.
You can now export data to the account and location that you configured when using Data Feeds. (Email locations are not supported with Data Warehouse or Classification sets.)
Legacy account types
These legacy account types are available only when exporting data with Data Feeds and Data Warehouse. These options are not available when importing data with Classification sets.
FTP
Data feed data can be delivered to an ÃÛ¶¹ÊÓƵ- or customer-hosted FTP location. Use the Directory path field to place feed files in a folder.
| Field | Function |
|---|---|
| Directory path | Enter the path to the directory on the FTP server. Folders must already exist; feeds throw an error if the specified path does not exist. For example, /folder_name/folder_name |
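Because feeds error out when the specified directory does not exist, it can help to sanity-check the path format before configuring the feed. A minimal illustrative sketch (not part of ÃÛ¶¹ÊÓƵ Analytics; the paths shown are hypothetical):

```python
import re

# Illustrative check: the directory path should be absolute and use
# forward slashes, e.g. /folder_name/folder_name
PATH_RE = re.compile(r"^(/[A-Za-z0-9._-]+)+/?$")

def looks_like_feed_path(path: str) -> bool:
    return bool(PATH_RE.match(path))

print(looks_like_feed_path("/feeds/daily"))  # True
print(looks_like_feed_path("feeds/daily"))   # False: not absolute
```

A format check like this cannot confirm that the folder actually exists on the server; verify that separately with your FTP client.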
SFTP
Data feed data can be delivered to an ÃÛ¶¹ÊÓƵ- or customer-hosted SFTP location. The destination site must contain a valid RSA or DSA public key. You can download the appropriate public key when creating the feed.
| Field | Function |
|---|---|
| Directory path | Enter the path to the directory on the SFTP server. Folders must already exist; feeds throw an error if the specified path does not exist. For example, /folder_name/folder_name |
S3
You can send Data Warehouse data directly to Amazon S3 buckets. This destination type requires a Bucket name, an Access Key ID, and a Secret Key. See the Amazon S3 documentation for more information.
The user you provide for uploading Data Warehouse data must have the following permissions:
- s3:GetObject
- s3:PutObject
- s3:PutObjectAcl
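The exact policy shape depends on your AWS setup, but an IAM policy granting the three permissions above might look like the following sketch (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
```

Attach the policy to the IAM user whose Access Key ID and Secret Key you enter in the destination configuration.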
The following 16 standard AWS regions are supported (using the appropriate signature algorithm where necessary):
- us-east-2
- us-east-1
- us-west-1
- us-west-2
- ap-south-1
- ap-northeast-2
- ap-southeast-1
- ap-southeast-2
- ap-northeast-1
- ca-central-1
- eu-central-1
- eu-west-1
- eu-west-2
- eu-west-3
- eu-north-1
- sa-east-1
NOTE: The cn-north-1 region is not supported.
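Before pointing a Data Warehouse destination at a bucket, you can confirm that its region is in the supported list above. A minimal sketch (the set below simply mirrors the list in this article; the function name is illustrative):

```python
# The 16 supported AWS regions listed above; cn-north-1 is explicitly unsupported.
SUPPORTED_REGIONS = {
    "us-east-2", "us-east-1", "us-west-1", "us-west-2",
    "ap-south-1", "ap-northeast-2", "ap-southeast-1", "ap-southeast-2",
    "ap-northeast-1", "ca-central-1", "eu-central-1", "eu-west-1",
    "eu-west-2", "eu-west-3", "eu-north-1", "sa-east-1",
}

def is_supported_region(region: str) -> bool:
    return region in SUPPORTED_REGIONS

print(is_supported_region("eu-west-1"))   # True
print(is_supported_region("cn-north-1"))  # False
```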
Azure Blob
Data Warehouse supports Azure Blob destinations. This destination type requires a container, account, and key. Azure automatically encrypts the data at rest. When you download the data, it is decrypted automatically. See the Microsoft Azure documentation for more information.
NOTE: You must implement your own process to manage disk space on the data warehouse destination. ÃÛ¶¹ÊÓƵ does not delete any data from the server.