Particle to Snowflake

This page provides you with instructions on how to extract data from Particle and load it into Snowflake. (If this manual process sounds onerous, check out Stitch, which can do all the heavy lifting for you in just a few clicks.)

About Particle

Particle helps its users bring their Internet of Things (IoT) products to market faster. It provides a secure, easy-to-use, full-stack IoT cloud platform and low-cost connected hardware.

About Snowflake

Snowflake is a cloud-based data warehouse that runs as a managed service. If you don't want to deal with hardware, software, or upkeep for a data warehouse, you're going to love Snowflake. It runs on the wicked fast Amazon Web Services architecture, using EC2 for compute and S3 for storage. Snowflake is designed to be flexible and easy to work with where other relational databases are not. One example of this is query execution: Snowflake creates virtual warehouses where query processing takes place. Each virtual warehouse runs on a separate compute cluster, so querying one virtual warehouse doesn't slow down the others. If you have ever had to wait for a query to complete, you know the value of speed and efficiency in query processing.

Getting data out of Particle

The first step for getting your Particle data into Snowflake is collecting that data from Particle's servers. Particle exposes events through webhooks. To use webhooks, log into your Particle console and click the Integrations tab, then click New Integration > Webhook. Set the event name to the item you want to track; it's good practice to specify the name of the field where you want the data to live in Snowflake. Set the URL to the endpoint that will receive the data. Leave the request type as POST. In the device field, select the device you want to trigger the webhook. Finally, click Create Webhook.

Sample Particle Data

Now Particle will send data via the webhook through a POST request whenever an event triggers it to do so. The data will be enclosed in the body of the request in JSON format, with fields that reflect the event you're tracking. For instance:

{ "event": [event-name], "data": [event-data], "published_at": [timestamp], "coreid": [device-id] }

Preparing data for Snowflake

Depending on the structure your data is in, you may need to prepare it for loading. Take a look at the supported data types for Snowflake and make sure the data you've got will map neatly to them. If you have a lot of data, you should compress it; gzip, bzip2, Brotli, Zstandard v0.8+, deflate, and raw deflate compression types are all supported.
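As a sketch of that preparation step, here's one way to batch events into a gzip-compressed, newline-delimited JSON file using only Python's standard library; the sample event and file name are placeholder values:

    # Sketch: write collected events as gzipped newline-delimited JSON,
    # a layout Snowflake's JSON file format loads row by row.
    import gzip
    import json

    # Placeholder event in the shape Particle's webhook delivers.
    events = [
        {"event": "temperature", "data": "22.4",
         "published_at": "2019-01-01T00:00:00.000Z",
         "coreid": "1234567890abcdef"},
    ]

    with gzip.open("particle_events.json.gz", "wt", encoding="utf-8") as f:
        for event in events:
            f.write(json.dumps(event) + "\n")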

One important thing to note here is that you don't need to define a schema in advance when loading JSON data into Snowflake. Onward to loading!

Loading data into Snowflake

There is a good reference for this step in the Data Loading Overview section of the Snowflake documentation. If there isn't much data that you're trying to load, then you might be able to use the data loading wizard in the Snowflake web UI, but chances are the limitations on that tool will make it a non-starter as a reliable ETL solution. There are two main steps to getting data into Snowflake:

  • Use the PUT command to stage files
  • Use the COPY INTO table command to load the staged data into the awaiting table

For the COPY step, you'll have the option of copying from your local drive or from Amazon S3. One of Snowflake's slick features lets you create a virtual warehouse to power the insertion process.
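Here's a rough sketch of those two steps using the snowflake-connector-python package; the connection parameters, table name, and file path are placeholder values, and the single VARIANT column takes advantage of the schema-less JSON loading mentioned above:

    # Sketch: stage a local file and COPY it into a table
    # (pip install snowflake-connector-python). Connection details,
    # table name, and file path are example values.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="YOUR_USER",
        password="YOUR_PASSWORD",
        account="YOUR_ACCOUNT",
        warehouse="LOAD_WH",
        database="IOT",
        schema="PUBLIC",
    )
    cur = conn.cursor()

    # A single VARIANT column means no schema needs defining up front.
    cur.execute("CREATE TABLE IF NOT EXISTS particle_events (v VARIANT)")

    # Step 1: PUT the compressed file into the table's internal stage.
    cur.execute("PUT file:///tmp/particle_events.json.gz @%particle_events")

    # Step 2: COPY the staged file into the table as JSON.
    # With no FROM clause, COPY defaults to the table's own stage.
    cur.execute("COPY INTO particle_events FILE_FORMAT = (TYPE = 'JSON')")

    conn.close()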

Easier and faster alternatives

If all this sounds a bit overwhelming, don’t be alarmed. If you have all the skills necessary to go through this process, chances are building and maintaining a script like this isn’t a very high-leverage use of your time.

Thankfully, products like Stitch were built to solve this problem automatically. With just a few clicks, Stitch starts extracting your Particle data via the API, structuring it in a way that is optimized for analysis, and inserting that data into your Snowflake data warehouse.