CDPs have been widely adopted by product, marketing, and data teams due to their ability to collect customer data with a single API, resolve user identities across channels, and activate this data through direct integrations with downstream systems. Many teams that are currently benefiting from a CDP also have a data warehouse in their stack to serve analytics, storage and business intelligence use cases.
In addition to offering marketing and product teams a way to easily activate customer data, CDPs can also serve as a simple means of moving customer data into a DWH. Taking advantage of a CDP’s integration with a DWH eliminates the need for developers to set up direct integrations with the DWH, giving engineering teams one less integration to maintain. It also reduces the amount of third-party code that needs to be shipped to applications and websites. And finally, it gives non-technical teams access to the same data stream that is being loaded into the DWH, without requiring engineering support.
In this use case, we will explore how to collect eCommerce customer data with mParticle, forward this data from mParticle to Snowflake, and query it within Snowflake to surface targeted customer segments.
The first step is to connect your eCommerce app to mParticle as a Web input, so that the customer events you collect begin flowing into your mParticle workspace.
Next, you will need to create a warehouse, database, and schema within Snowflake where you will eventually forward your eCommerce data. You can easily accomplish this by running a few SQL commands within a Snowflake Worksheet:
https://www.youtube.com/watch?v=Yh-wAcGrBx4
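As a reference, the setup commands might look something like the sketch below. The warehouse, database, and schema names are placeholders, so substitute whatever naming convention your team uses.

```sql
-- Create a small warehouse, a database, and a schema for mParticle data.
-- All object names below are placeholders.
CREATE WAREHOUSE IF NOT EXISTS MPARTICLE_WH
  WAREHOUSE_SIZE = 'XSMALL'
  AUTO_SUSPEND = 300
  AUTO_RESUME = TRUE;

CREATE DATABASE IF NOT EXISTS ECOMMERCE_DB;

CREATE SCHEMA IF NOT EXISTS ECOMMERCE_DB.MPARTICLE;
```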
Now you’ll need to create a role with permissions to manage the database you just created within Snowflake, along with a dedicated user for mParticle that is granted that role, again by running direct SQL commands:
https://www.youtube.com/watch?v=NIl6Sk32o80
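The sketch below shows one way to set this up, continuing with the placeholder names from the previous step; the role, user name, and password are likewise placeholders.

```sql
-- Create a role with full access to the mParticle database and schema.
CREATE ROLE IF NOT EXISTS MPARTICLE_ROLE;

GRANT USAGE ON WAREHOUSE MPARTICLE_WH TO ROLE MPARTICLE_ROLE;
GRANT ALL ON DATABASE ECOMMERCE_DB TO ROLE MPARTICLE_ROLE;
GRANT ALL ON SCHEMA ECOMMERCE_DB.MPARTICLE TO ROLE MPARTICLE_ROLE;
GRANT ALL ON FUTURE TABLES IN SCHEMA ECOMMERCE_DB.MPARTICLE TO ROLE MPARTICLE_ROLE;

-- Create a dedicated user for mParticle and grant it the role.
CREATE USER IF NOT EXISTS MPARTICLE_USER
  PASSWORD = '<choose-a-strong-password>'
  DEFAULT_ROLE = MPARTICLE_ROLE
  DEFAULT_WAREHOUSE = MPARTICLE_WH;

GRANT ROLE MPARTICLE_ROLE TO USER MPARTICLE_USER;
```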
In your mParticle workspace, select “Data Warehouses” under “Setup” in the navigation menu. Select Snowflake, then click on the plus icon to name and add a new Snowflake configuration.
Within the “Settings” tab of your Snowflake configuration, enter the database, warehouse, and schema names you created in Snowflake, along with the user ID and password you set for your dedicated user. Within this “Settings” screen, you’ll also supply an event threshold, which tells Snowflake to create a dedicated table once it receives a specified number of events of the same type, as well as a loading delay, which lets you adjust how often you want to forward your event batches to Snowflake.
Finally, connect individual inputs in mParticle to your Snowflake output from within the Connections page. Since we are forwarding events from the eCommerce app that we connected to mParticle in step 1, select the “Web” input here.
https://www.youtube.com/watch?v=vghRsAuYSYo
Once Snowflake has received its first event batch from mParticle, you will see this data represented as tables and views within your Snowflake schema.
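Once the data has landed, a quick check from a Snowflake Worksheet can confirm that objects are being created. This assumes the placeholder database and schema names used earlier:

```sql
-- List the tables and views mParticle has created in the schema.
USE DATABASE ECOMMERCE_DB;
SHOW TABLES IN SCHEMA MPARTICLE;
SHOW VIEWS IN SCHEMA MPARTICLE;
```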
Now that our eCommerce data is being forwarded to Snowflake, we can write SQL queries directly within our Snowflake Worksheet to retrieve targeted customer segments. In this example, let’s assume that our product team has an educated hypothesis that customers who select red couches have a higher-than-average likelihood of making repeat purchases. Since we are logging an “Add Customization–Color” event in mParticle and forwarding these events to Snowflake, we can easily pull this audience with a simple SQL query.
https://www.youtube.com/watch?v=ngsmU-IBTIY
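A query along these lines would do the job. The table and column names below are illustrative assumptions, so adjust them to match the tables mParticle actually creates in your schema:

```sql
-- Pull the email addresses of customers who customized a couch in red.
-- Table and column names are assumptions -- check your own schema.
SELECT DISTINCT
    user_email
FROM ECOMMERCE_DB.MPARTICLE.ADD_CUSTOMIZATION_COLOR
WHERE color = 'red';
```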
Excellent! Since we were capturing email as a user attribute along with these events, we can now export this audience as a CSV that will allow our marketing team to contact these customers with targeted offers. Additionally, since we more than likely have customer email identifiers in data sets from other sources, we have the option of running JOIN statements to combine this audience with information stored elsewhere in our data warehouse.
https://www.youtube.com/watch?v=ztL13IsGbM0
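For instance, a hypothetical JOIN against an orders table loaded from another source could test the repeat-purchase hypothesis directly. Every table and column name here is an assumption, and the results can be downloaded as a CSV from the Worksheet once the query runs:

```sql
-- Join the red-couch audience against a hypothetical ORDERS table
-- to count how many of these customers have made repeat purchases.
WITH red_couch_customers AS (
    SELECT DISTINCT user_email
    FROM ECOMMERCE_DB.MPARTICLE.ADD_CUSTOMIZATION_COLOR
    WHERE color = 'red'
)
SELECT
    o.customer_email,
    COUNT(*) AS order_count
FROM ECOMMERCE_DB.SALES.ORDERS AS o
JOIN red_couch_customers AS r
    ON o.customer_email = r.user_email
GROUP BY o.customer_email
HAVING COUNT(*) > 1;
```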