The only data warehouse built for the cloud.
With Snowflake, you can quickly enable analytics, gather all your data, and provide data-driven insights to all of your business users.
❕ IMPORTANT: Snowflake can enforce a network policy that restricts access based on IP address.
❕ IMPORTANT: Before moving on, note that to set up your authentication correctly you must authenticate with a user whose role has been granted the appropriate privileges for all of the Warehouses, Databases, Schemas, and Tables involved.
Before proceeding, you must have executed a series of statements similar to the ones below while acting as ACCOUNTADMIN in your Snowflake account dashboard:
Privileges are granted first to a warehouse, then to the databases, then to their schemas and specific tables.
This is required to enable access to the tables. You can configure the exact table permissions (select, insert, etc.) based on the operations you want the connector to perform.
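The exact statements depend on your object names. As a sketch (the role, warehouse, database, schema, and table names below are placeholders, not values from this guide), the grant sequence might look like:

```sql
-- Placeholder names: my_role, my_wh, my_db, my_schema, client_info.
GRANT USAGE ON WAREHOUSE my_wh TO ROLE my_role;
GRANT USAGE ON DATABASE my_db TO ROLE my_role;
GRANT USAGE ON SCHEMA my_db.my_schema TO ROLE my_role;
-- Grant only the table permissions the connector actually needs:
GRANT SELECT, INSERT ON TABLE my_db.my_schema.client_info TO ROLE my_role;
```

Broadening or narrowing the final grant (e.g. adding UPDATE or DELETE) controls which connector operations will succeed against that table.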
The first step in using the Snowflake connector is to go to your account page and choose the relevant workflow. Once inside the workflow dashboard, find the Snowflake connector in the connectors panel (on the left) and drag it onto your workflow.
With the new Snowflake connector step selected, click "New Authentication" in the properties panel on the right, under the "Settings" heading.
This will open an authentication pop-up window. On the first page you will be asked to name your authentication and indicate whether it is personal or organizational.
The next screen asks for your "Account", "Username", "Password", and "Region" (the latter is optional, depending on how you fill in the "Account" field).
The 'Account' field in the authentication window has two possible entries.
The first is to include the region in the 'Account' field itself: if the URL for your Snowflake instance is https://ls28623.us-east-1.snowflakecomputing.com, the 'Account' value would be ls28623.us-east-1, i.e. the account name with the region (us-east-1) appended.
The second is to enter only the account name from the URL (ls28623) in the 'Account' field and put the region (us-east-1) in the 'Region' field.
The images below show examples of both situations.
The "Username" and "Password" fields take the username and password of your Snowflake account.
While most of these parameters are use-case specific, you must configure a default Warehouse for your authentication, because it is retrieved automatically when you run Snowflake operations.
In Snowflake terminology, a warehouse is a cluster of compute resources that you designate to operate on your databases and tables, and that can be resized according to the amount of processing power required.
If you set defaults for "Database" and "Schema", you can run queries and statements that reference a table name directly:
If you don't set default values for "Database" and "Schema", you'll need to enter object names fully-qualified, in the form <db_id>.<schema_id>.<object_id>:
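To make the difference concrete (the database, schema, and table names below are illustrative), the same query in each case might look like:

```sql
-- With default Database and Schema set on the authentication:
SELECT * FROM client_info;

-- Without defaults, the table name must be fully qualified:
SELECT * FROM my_db.my_schema.client_info;
```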
After clicking the "Create authentication" button, return to the authentication field in the workflow dashboard properties panel and choose the newly created authentication from the dropdown menu.
Your connector authentication should now be fully configured.
Notes on using Snowflake
Load data to table
The "Load data to table" operation takes a "Files" input field, which expects a list or array of filenames to load.
Note that it is not possible to stream files to a Snowflake data warehouse with this operation.
Insert rows in table
When using the "Insert rows in table" operation, it's crucial to remember that any String/Varchar value must be enclosed in single quotes (').
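As a sketch of what this means (the values below are made up for illustration, using the client_info table from the example later on this page), the statement the operation ultimately issues must quote the string columns like this:

```sql
-- client_id is numeric and stays unquoted; the Varchar columns
-- (name, contact_no) must be wrapped in single quotes.
INSERT INTO client_info (client_id, name, contact_no)
VALUES (1, 'Renaldo Castaignet', '555-0100');
```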
The following examples demonstrate one or two of the available connector operations.
For information on all of the actions this connector can perform, please see the Full Operations Reference at the bottom of this page.
Below is an example of how you might use the Snowflake connector to insert new records into a Snowflake table and read all of its records back.
When records are received, the workflow checks whether the target table already exists in the Snowflake database. If it doesn't, the workflow creates it first, then inserts the received records into it. Once the records have been added, the workflow reads the entries back from the chosen Snowflake table.
The operations of the Snowflake connector shown in this example are as follows:
- Custom query: executes a raw, user-specified SQL query on the selected warehouse.
- Insert rows in table: inserts one or more rows into the table identified by your chosen warehouse, database, table schema, and table.
- List table rows: finds and reads rows using the combination of your chosen warehouse, database, table schema, and table.
The actions are as follows:
- Pull the client records from the source (in this case, a webhook trigger) and, if the table doesn't already exist, create it in the Snowflake database.
- Using the Loop connector, insert each record one at a time into the Snowflake database.
- Read the Snowflake table's entries.
Your entire process should resemble this:
1. Pull records & create a table
Choose the Webhook trigger from the list of available triggers and set the operation to "When a webhook is received, automatically respond with HTTP 200." Before moving on, make sure to click the "Enable" button, so that the workflow is ready to receive incoming calls.
The webhook received the following data in JSON format:
[
  { "name": "Renaldo Castaignet", … },
  { "name": "Arlen Grumbridge", … },
  { "name": "Veronika Fitter", … },
  { "name": "Tirrell Gladbach", … },
  { "name": "Emmalynne Gibbard", … }
]
Keep in mind that the format in which you receive data depends on its source; retrieving records from Salesforce, a web form, etc. will produce different results/views.
Next, add the Snowflake connector and set the operation to "Custom query". In the "Query" field, enter the following query:
CREATE TABLE IF NOT EXISTS client_info (client_id INT, name VARCHAR, email VARCHAR, contact_no VARCHAR)
The query above checks whether the requested table (in this case, client_info) exists in the Snowflake database. If it does, the workflow continues; if not, the table is created with the columns specified in the query.
2. Add records
Add a Loop connector with the operation set to "Loop list" and the "List" field set to $.steps.trigger.body. The connector-snake can be used to generate this jsonpath automatically.
Inside the Loop connector, add a Snowflake connector and set the operation to "Insert rows in table".
In the "Table" field, enter the name of the table you created in the previous step; in this case, client_info.
🛈 The "Table" value need not be hard-coded. For instance, you might retrieve it via your trigger's custom data field, e.g. $.trigger.table or $.trigger.snowflake_table.
Click the "Add to Rows" button under the "Rows" box to add a single "Row". Using the 'Add more than one item to Data' link, you can add additional pieces of data to this Row.
Choose a number here equal to the number of columns in your chosen table. Based on the 'Create table if not exists' query used in the first step, the client_info table in this example has four columns. Click the "Add" button when finished.
Four "Column" and "Value" pairs are now available under the "Column entry" section. Enter the name of each "Column" one by one in the "Column" box. As seen in the image below, the client_id, name, email, and contact_no columns have all been entered.
Remember that Snowflake only accepts String/Varchar values enclosed in single quotes.
This is easy when hardcoding string values:
to supply the value Tirrell Gladbach for the "name" field, simply wrap it in single quotes, as in the image below.
In this example, however, we are not hardcoding the values; we are feeding them in dynamically as jsonpaths, looping through the webhook's data and inserting it into the Snowflake database record by record.
To do this while still supplying the required single quotes, we must use interpolated mode.
We can use the connector-snake to find the jsonpath for each column from the loop step. When the jsonpath is generated, the type selector for each 'Value' field is automatically set to 'jsonpath':
Follow the steps below for every "Value" field whose column data type is String or Varchar. From the create table query used in the first step, the columns "name", "email", and "contact_no" all use a string data type.
1. Reset the type selector for each of the above fields to "String". This displays the full jsonpath for that "Value" field; for the name column it should look like $.steps.loop-1.value.name.
2. Now enclose the jsonpath in opening and closing curly brackets and wrap the result in single quotes at both ends. The value should end up looking like '{$.steps.loop-1.value.name}'.
3. Click elsewhere in the builder to shift focus away from the field. When you've finished, the jsonpath should appear as it does in the picture below.
Be sure to follow the same steps for every field whose column data type is String or Varchar.
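Putting the pieces together, and assuming the brace syntax is resolved by interpolated mode at runtime (this rendering is a sketch, not output copied from the product), the insert for each record is effectively built like this:

```sql
-- Sketch: each {...} placeholder is replaced with the looped record's value,
-- leaving the Varchar columns wrapped in the single quotes we added;
-- client_id is numeric, so its placeholder is left unquoted.
INSERT INTO client_info (client_id, name, email, contact_no)
VALUES ({$.steps.loop-1.value.client_id},
        '{$.steps.loop-1.value.name}',
        '{$.steps.loop-1.value.email}',
        '{$.steps.loop-1.value.contact_no}');
```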
3. Read records
Finally, add a second Snowflake connector outside the loop and set the operation to "List table rows".
In the "Table" field, enter the name of the table (client_info) that you created in the first step and inserted records into in the previous step.
The "List table rows" operation in this step lists all of the rows from the selected table, confirming that the intended data reached the table.
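In SQL terms, this step is roughly equivalent to running the following against the example table (the operation itself builds the query from your chosen warehouse, database, schema, and table):

```sql
-- List every row currently in the example table.
SELECT * FROM client_info;
```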
The final step's records should resemble those below:
[To be created]