Snowflake integration makes it easy to load Insightful’s data into a Snowflake data warehouse. When you integrate Snowflake with Insightful, you get a fully managed data pipeline loaded into a powerful and cost-effective data warehouse.
Insightful runs a periodic process to pull data related to your organization, teams, and employees from our database and load it into your Snowflake account.
This integration is a Premium add-on.
Getting Started
To create the integration in Insightful, please follow these instructions:
Once you log in to your admin’s dashboard, select Settings → Integrations → Data Warehouse and search for Snowflake.
The Data Warehouse feature is disabled by default. Click Request access and allow our team up to one working day to enable it for your organization.
Once the feature is enabled, click on Configure Integration.
Create the integration in Insightful
To successfully connect your Snowflake account to Insightful, you’ll need the following details prepared first. Make sure you have either:
ACCOUNTADMIN and SECURITYADMIN roles/permissions on Snowflake, or
A JSON file with login details, provided by your admin or IT team.
In the setup form, you’ll enter your Snowflake connection details. Here’s what you’ll need and how to get each piece:
1. Account
In the Snowflake dashboard, go to Account details and copy the Account Identifier (e.g., ABCDE-XY12345) into the field.
2. Username
Use the Snowflake username chosen at signup or provided by your admin (e.g., TEST.USER).
3. Database
If you don’t already have one, open the SQL worksheet in Snowflake and run:
CREATE DATABASE database_name;
Replace database_name with your chosen name, and then use it in the form.
4. Schema
By default, there’s a PUBLIC schema under your database. If you prefer a new schema, run:
CREATE SCHEMA database_name.my_schema;
Replace database_name with your database name and my_schema with your preferred schema name.
5. Stage Name
After the Database and Schema have been created, create a stage by running the following SQL command in Snowflake:
CREATE STAGE schema_name.stage_name;
Replace schema_name and stage_name as needed.
6. Warehouse
To create a virtual warehouse, run the following SQL command in Snowflake:
CREATE WAREHOUSE my_warehouse;
Replace my_warehouse with your chosen name.
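If you are creating everything from scratch, the statements from steps 3–6 can be run together as one script in a Snowflake worksheet. All names below are placeholders; substitute your own:

```sql
CREATE DATABASE insightful_db;                    -- step 3
CREATE SCHEMA insightful_db.insightful_schema;    -- step 4
CREATE STAGE insightful_schema.insightful_stage;  -- step 5 (insightful_db is the current database after step 3)
CREATE WAREHOUSE insightful_wh;                   -- step 6
```

Whatever names you choose here are the values you will enter in the setup form.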
7. Private Key
To securely connect Insightful to Snowflake, you’ll need to generate an RSA key pair:
7.1. Generate RSA Key Pair
Run the following commands in your terminal one by one (macOS/Linux/WSL/Git Bash):
# Generate private key
openssl genpkey -algorithm RSA -out private_key.pem -pkeyopt rsa_keygen_bits:2048
# Extract public key
openssl pkey -in private_key.pem -pubout -out public_key.pem
7.2. Add Public Key to Snowflake User
Log in with a user that has the SECURITYADMIN role and run:
ALTER USER "User.1" SET RSA_PUBLIC_KEY='<your-public-key>';
Replace "User.1" with your actual Snowflake username.
Replace <your-public-key> with the content of your public_key.pem file as a single-line string: no line breaks, and without the -----BEGIN/END PUBLIC KEY----- delimiter lines.
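The single-line string can be produced from the terminal. A minimal sketch (the openssl lines only create demo keys when none exist, so the snippet is self-contained; normally you’d reuse the keys from step 7.1):

```shell
# Create a demo key pair only if step 7.1 hasn't been run yet.
if [ ! -f public_key.pem ]; then
  openssl genpkey -algorithm RSA -out private_key.pem -pkeyopt rsa_keygen_bits:2048
  openssl pkey -in private_key.pem -pubout -out public_key.pem
fi
# Drop the -----BEGIN/END PUBLIC KEY----- lines and join the base64
# body into a single line, the format RSA_PUBLIC_KEY expects.
PUBKEY=$(grep -v '^-----' public_key.pem | tr -d '\n')
echo "$PUBKEY"
```

Paste the printed value between the quotes in the ALTER USER statement above.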
7.3. Copy Private Key to the App
Open private_key.pem in a text editor (e.g., VS Code, Notepad, Sublime). Copy the entire content (including -----BEGIN PRIVATE KEY----- and -----END PRIVATE KEY-----) and paste it into the Private Key field in the app.
Alternative Setup via JSON (for IT/Admins)
Instead of filling out the form manually, your IT or Snowflake Admin can provide you with a pre-filled JSON file. This file can be uploaded using the Upload JSON button to automatically populate all required fields.
Example JSON file:
{
  "account": "your_account_id",
  "username": "YOUR.USERNAME",
  "authenticator": "SNOWFLAKE_JWT",
  "privateKey": "-----BEGIN PRIVATE KEY-----\n...key contents...\n-----END PRIVATE KEY-----",
  "warehouse": "YOUR_WAREHOUSE",
  "database": "YOUR_DATABASE",
  "schema": "YOUR_SCHEMA",
  "integrationType": "snowflake",
  "stage": "YOUR_STAGE"
}
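A common stumbling block is the privateKey field: JSON strings cannot contain raw line breaks, so each newline in private_key.pem must be written as a literal \n escape. A minimal sketch that prints the properly escaped value, assuming Python 3 is on your PATH (the openssl line only creates a demo key when none exists):

```shell
# Create a demo key only if step 7.1 hasn't been run yet.
[ -f private_key.pem ] || openssl genpkey -algorithm RSA -out private_key.pem -pkeyopt rsa_keygen_bits:2048
# json.dumps turns real newlines into \n escapes and adds the surrounding
# quotes, so the output can be pasted directly as the "privateKey" value.
python3 -c 'import json; print(json.dumps(open("private_key.pem").read().strip()))'
```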
Synchronization Form
In the Connection name field, enter the desired name of the integration.
By clicking Sync Start Date, you can select the day from which data should be synced:
For data stored in Elasticsearch, you can sync data up to one year in the past.
For fragments, sync is limited to data from the last three months on active nodes only. A tooltip explains: “Data sync can’t be older than 3 months.” Frozen nodes are excluded.
For data stored in the Mongo database, the Sync Start Date is ignored; the system will pull all available data starting from the organization’s creation date.
Example: On June 25th, nearly three months of fragment data can be synced; by July 1st, this drops to just over two months, as fragments from May move to frozen nodes.
FAQ
How does Snowflake pricing work?
Snowflake pricing is based primarily on compute and storage. You pay for the time your virtual warehouse (the compute resource) is running, billed per second with a 60-second minimum. Compute cost depends on the warehouse size (e.g., X-Small, Small, Medium) and usage duration. To optimize costs, you can configure auto-suspend (to pause when idle) and auto-resume (to start only when needed). For full details, see Snowflake’s official pricing documentation.
How do I query my data in Snowflake?
You can connect a BI tool like Mode or Looker to Snowflake, or query directly from the Snowflake SQL console (Worksheets in the menu).
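For example, once a sync has completed, you could run something like the following in a worksheet. The database, schema, and table names here are purely hypothetical; browse the database you configured during setup to see the actual tables:

```sql
-- All names are placeholders; substitute your own.
SELECT *
FROM YOUR_DATABASE.YOUR_SCHEMA.YOUR_TABLE
LIMIT 10;
```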
Can I customize my sync schedule?
We have a few sync schedules that you can choose from:
Hourly (every hour, on the hour)
Daily (every day starting at 00:00 UTC)
Weekly (every Sunday at 00:00 UTC)
If a sync fails, a new attempt is made every 15 minutes until a sync succeeds. You cannot define your own schedule beyond the three options above. Please note that each type of sync may start with a delay of up to 15 minutes.