Amplitude users can now load their Amplitude event data directly into a Snowflake account. You can set up recurring syncs through the Amplitude UI, as well as manually trigger a sync of your historical data.
NOTE: Snowflake direct load is currently in Beta release, available to all paid customers.
Set up a recurring data export to Snowflake
To set up a recurring export of your Amplitude data to Snowflake, follow these steps:
NOTE: You will need admin privileges in Amplitude, as well as a role that allows you to create the required resources in Snowflake.
- Navigate to Sources and Destinations → Destinations.
- Under Add More Destinations …, click the Snowflake panel.
The Export Data to Snowflake page will open to the Getting Started tab.
- Under Export Data to Snowflake, select the data you’d like to export.
NOTE: This is currently limited to events ingested today and in the future.
- Review the Event table and Merge IDs table schemas and click Next >. The Set Up Export tab will open.
- In the Snowflake Credentials For Amplitude section, enter the following information:
- Account Name: Your Snowflake account name. This is the first part of your Snowflake URL, before ‘snowflakecomputing.com’.
- Warehouse: The warehouse Amplitude will use to load the data. Ideally, this will be a warehouse dedicated to loading Amplitude data; this way, other regular Snowflake operations will not be disrupted.
- Database: The database where the data will be stored. Similarly, this database should also be dedicated specifically to Amplitude data.
- Username: The username Amplitude will use to connect to the Snowflake account.
- Password: The password associated with that username.
NOTE: These credentials are case-sensitive.
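If you are unsure which part of your Snowflake URL is the account name, a small helper like the following can extract it. This is an illustrative sketch, not part of Amplitude or Snowflake tooling, and the example URL is a hypothetical placeholder.

```python
# Illustrative helper: derive the "Account Name" field from a full Snowflake
# URL by stripping the ".snowflakecomputing.com" suffix from the hostname.
from urllib.parse import urlparse

def snowflake_account_name(url: str) -> str:
    """Return the portion of a Snowflake URL before '.snowflakecomputing.com'."""
    parsed = urlparse(url)
    host = parsed.netloc or parsed.path  # also accept a bare hostname with no scheme
    suffix = ".snowflakecomputing.com"
    if not host.endswith(suffix):
        raise ValueError(f"not a snowflakecomputing.com hostname: {host}")
    return host[: -len(suffix)]

# Hypothetical example URL:
print(snowflake_account_name("https://xy12345.us-east-1.snowflakecomputing.com"))
# → xy12345.us-east-1
```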
- Next to the credentials section, Amplitude dynamically generates the query it will use to create the necessary Snowflake objects. Click Copy to copy it to the clipboard, then paste and run it in your Snowflake account.
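The exact query Amplitude generates depends on the credentials you entered. As a rough, hypothetical sketch of the kind of objects such a setup script typically creates (a role, a user, and grants on the warehouse and database), here is an illustrative generator; all object names are placeholders, and the real query from the UI should always be used instead.

```python
# Hypothetical sketch only: builds Snowflake SQL statements of the kind a
# setup script for a loading tool typically needs. This is NOT the actual
# query Amplitude generates; use the one shown in the Amplitude UI.
def setup_sql(warehouse: str, database: str, username: str, role: str) -> str:
    statements = [
        f"CREATE ROLE IF NOT EXISTS {role};",
        f"CREATE USER IF NOT EXISTS {username};",
        f"GRANT ROLE {role} TO USER {username};",
        f"GRANT USAGE ON WAREHOUSE {warehouse} TO ROLE {role};",
        f"GRANT ALL PRIVILEGES ON DATABASE {database} TO ROLE {role};",
    ]
    return "\n".join(statements)

# Placeholder object names for illustration:
print(setup_sql("AMPLITUDE_WH", "AMPLITUDE_DB", "AMPLITUDE_USER", "AMPLITUDE_ROLE"))
```

Note that in Snowflake, privileges are granted to roles rather than directly to users, which is why the sketch creates a dedicated role and grants it to the loading user.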
- Click Next >. Amplitude will try to upload test data using the credentials you entered. If the upload is successful, click Finish.
All future events will automatically be sent to Snowflake.
From here, Amplitude generates micro-batch files at five-minute intervals and loads them directly into your Snowflake account every 10 minutes. The data should appear in your Snowflake account within 20 minutes of Amplitude receiving the events.
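The cadences above can be turned into a back-of-the-envelope latency estimate:

```python
# Back-of-the-envelope sketch of the worst-case delay between Amplitude
# receiving an event and that event being picked up by a Snowflake load,
# using the cadences stated above.
BATCH_INTERVAL_MIN = 5   # micro-batch files are generated every 5 minutes
LOAD_INTERVAL_MIN = 10   # batches are loaded into Snowflake every 10 minutes

# Worst case: an event waits a full batch interval to land in a file, and that
# file then waits a full load interval for the next load cycle.
worst_case_before_load = BATCH_INTERVAL_MIN + LOAD_INTERVAL_MIN
print(worst_case_before_load)  # 15 minutes before the load starts
```

The remaining margin up to the quoted 20 minutes leaves time for the load itself to complete.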
Export historical Amplitude data to Snowflake
To export your historical data from Amplitude into Snowflake, click Export Data and select a date range.
This process can take anywhere from a single day to several weeks, depending on factors such as your data volume, warehouse size, cluster count, network bandwidth, and the number of concurrent historical exports you are running.