
A Streamlined Workflow for Forwarding Customer Data to Snowflake

by mParticle, February 1st, 2022

Too Long; Didn't Read

Connecting your CDP to your Data Warehouse is a great way to simplify the flow of your customer data and eliminate overhead for your engineering team. Here's how you can set up data forwarding from mParticle to Snowflake.



CDPs have been widely adopted by product, marketing, and data teams due to their ability to collect customer data with a single API, resolve user identities across channels, and activate this data through direct integrations with downstream systems. Many teams currently benefiting from a CDP also have a data warehouse in their stack to serve analytics, storage, and business intelligence use cases.


In addition to offering marketing and product teams a way to easily activate customer data, CDPs can also give teams a straightforward path for moving customer data into a data warehouse (DWH). Taking advantage of a CDP’s integration with a DWH eliminates the need for developers to build and maintain direct warehouse integrations, giving engineering teams one less pipeline to support. It also reduces the amount of third-party code shipped to applications and websites. Finally, it gives non-technical teams access to the same data stream being loaded into the DWH without requiring engineering support.


In this use case, we will explore how to collect eCommerce customer data with mParticle, forward that data from mParticle to Snowflake, and then query it to retrieve targeted customer segments for analysis and activation.

Step 1: Integrate mParticle’s SDK into your eCommerce client

The first step is to implement event collection code throughout your eCommerce app using mParticle’s SDKs. When doing this, we recommend starting with a Data Plan. This blog post covers best practices for aligning stakeholders on developing a data plan using mParticle’s Data Plan Builder.

Step 2: Set up your warehouse, database, and schema in Snowflake

Next, you will need to create a warehouse, database, and schema within Snowflake to receive your eCommerce data. You can easily accomplish this by running a few SQL commands within a Snowflake Worksheet:


https://www.youtube.com/watch?v=Yh-wAcGrBx4
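For reference, a minimal sketch of those commands might look like the following. The warehouse, database, and schema names are placeholders you would replace with your own:

```sql
-- Minimal setup sketch; MPARTICLE_WH, MPARTICLE_DB, and ECOMM are placeholder names.
CREATE WAREHOUSE IF NOT EXISTS MPARTICLE_WH
  WITH WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 300        -- suspend after 5 minutes of inactivity
  AUTO_RESUME = TRUE;

CREATE DATABASE IF NOT EXISTS MPARTICLE_DB;

CREATE SCHEMA IF NOT EXISTS MPARTICLE_DB.ECOMM;
```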

Step 3: Create roles and manage user permissions

Now you’ll need to create a role with permissions to manage the database you just created within Snowflake, again by running direct SQL commands:


https://www.youtube.com/watch?v=NIl6Sk32o80
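A sketch of that role and user setup is shown below, reusing the placeholder names from Step 2. The role name, user name, and password are assumptions you would replace with your own values:

```sql
-- Role and user sketch; MPARTICLE_ROLE and MPARTICLE_USER are placeholder names.
CREATE ROLE IF NOT EXISTS MPARTICLE_ROLE;

-- Grant the role access to the warehouse, database, and schema created in Step 2
GRANT USAGE ON WAREHOUSE MPARTICLE_WH TO ROLE MPARTICLE_ROLE;
GRANT USAGE ON DATABASE MPARTICLE_DB TO ROLE MPARTICLE_ROLE;
GRANT ALL ON SCHEMA MPARTICLE_DB.ECOMM TO ROLE MPARTICLE_ROLE;

-- Create a dedicated user for mParticle and assign it the role
CREATE USER IF NOT EXISTS MPARTICLE_USER
  PASSWORD = '<strong-password-here>'
  DEFAULT_ROLE = MPARTICLE_ROLE
  DEFAULT_WAREHOUSE = MPARTICLE_WH;

GRANT ROLE MPARTICLE_ROLE TO USER MPARTICLE_USER;
```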

Step 4: Enable the Snowflake integration within mParticle

In your mParticle workspace, select “Data Warehouses” under “Setup” in the navigation menu. Select Snowflake, then click on the plus icon to name and add a new Snowflake configuration.


Within the “Settings” tab of your Snowflake configuration, enter the database, warehouse, and schema names you created in Snowflake, along with the user ID and password you set for your dedicated user. On this “Settings” screen, you’ll also supply an event threshold, which determines how many events of the same type must be received before a dedicated table is created for that event type, as well as a loading delay, which lets you adjust how often your event batches are forwarded to Snowflake.


Finally, connect individual inputs in mParticle to your Snowflake output from the Connections page. Since we are forwarding events from the eCommerce app connected to mParticle in Step 1, select the “Web” input here.


https://www.youtube.com/watch?v=vghRsAuYSYo


Step 5: Verify your data connection in Snowflake

Once Snowflake has received its first event batch from mParticle, you will see this data represented as tables and views within your Snowflake schema.


Snowflake will represent data from mParticle as "Tables" and "Views"
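If you prefer to verify this from a Worksheet rather than the Snowflake UI, a quick sanity check might look like the sketch below, again using the placeholder names from Step 2:

```sql
-- List the objects mParticle has created in the target schema
USE WAREHOUSE MPARTICLE_WH;
USE SCHEMA MPARTICLE_DB.ECOMM;

SHOW TABLES;
SHOW VIEWS;
```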

Step 6: Query your data to retrieve targeted audiences

Now that our eCommerce data is being forwarded to Snowflake, we can write SQL queries directly within our Snowflake Worksheet to retrieve targeted customer segments. In this example, let’s assume our product team has a hypothesis that customers who select red couches have a higher-than-average likelihood of making repeat purchases. Since we are logging an “Add Customization–Color” event in mParticle and forwarding these events to Snowflake, we can easily pull this audience with a simple SQL query.


https://www.youtube.com/watch?v=ngsmU-IBTIY
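As a rough illustration, the query might look something like the sketch below. The table name and the VARIANT attribute columns are assumptions for illustration; the actual names will depend on your data plan and on how mParticle maps events into your schema:

```sql
-- Hypothetical audience query; ADD_CUSTOMIZATION_COLOR, eventattributes, and
-- userattributes are assumed names, not guaranteed to match your schema.
SELECT DISTINCT
    userattributes:"email"::STRING AS email
FROM MPARTICLE_DB.ECOMM.ADD_CUSTOMIZATION_COLOR
WHERE eventattributes:"color"::STRING = 'Red';
```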


Excellent! Since we were capturing email as a user attribute along with these events, we can now export this targeted audience as a CSV, allowing our marketing team to contact these customers with targeted offers. And because customer email identifiers more than likely appear in data sets from other sources as well, we also have the option of running JOIN statements to combine this audience with other tables stored across our data warehouse, as in the sketch below.
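Such a JOIN might look like the following sketch; the second table and all of its columns are assumptions standing in for whatever other sources live in your warehouse:

```sql
-- Hypothetical JOIN; ORDERS and its columns are placeholders for another data source.
WITH red_couch_customers AS (
    SELECT DISTINCT userattributes:"email"::STRING AS email
    FROM MPARTICLE_DB.ECOMM.ADD_CUSTOMIZATION_COLOR
    WHERE eventattributes:"color"::STRING = 'Red'
)
SELECT
    c.email,
    COUNT(o.order_id) AS lifetime_orders
FROM red_couch_customers AS c
JOIN OTHER_SOURCE.SALES.ORDERS AS o
    ON o.customer_email = c.email
GROUP BY c.email;
```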

Watch an end-to-end demo of this use case:

https://www.youtube.com/watch?v=ztL13IsGbM0