Data Engineering | Blog | Snowflake | 6 min read

Loading data from local environments into Snowflake

Discover how to load structured data from a computer into Snowflake.
Author
Tamasine Key
Data & AI Engineer

In this blog post, we are going to learn how to load structured data from our computer into Snowflake.

Loading local data into Snowflake: before you begin

  1. Set up a data warehouse & database: Make sure the data you are about to import has a home within Snowflake, meaning a data warehouse, database, and schema are already set up. If you're not sure how to create those, check out our previous blog post, Snowflake 101: Setting Up Environment and Database.
  2. File format: The file must be in a format Snowflake supports. Check the Snowflake documentation below to make sure your file type is supported.

Supported Snowflake files
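
If you haven't set these up yet, the environment this tutorial assumes can be sketched in SQL as follows (object names are taken from the previous post; adjust the warehouse size to taste):

-- Create the warehouse, database, and schema this tutorial assumes
CREATE WAREHOUSE IF NOT EXISTS SNOWFLAKE_WH WITH WAREHOUSE_SIZE = 'XSMALL';
CREATE DATABASE IF NOT EXISTS RETAIL_DB_SALES;
CREATE SCHEMA IF NOT EXISTS RETAIL_DB_SALES.SALES_DATA;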

Download this file to follow along: Superstore_Sales.csv

Step 1: Create Table

(Screenshot: creating a table in Snowflake)

  1. Drill into your database by clicking its name. If you're following our previous tutorial, this is RETAIL_DB_SALES.
  2. From the Tables tab, click Create.
  3. When the Create Table dialog box opens, enter SUPERSTORE_SALES as the Table Name.
  4. Leave the Schema Name as SALES_DATA.
  5. Provide a short Comment about the data, such as “Sales per Category”.
  6. In the Columns section of the dialog, click the + sign to add each column. Check your data and set the column types accordingly. If a column should never contain NULL values, tick its Not Null checkbox.
  7. Click Finish.

As an alternative to the UI, you can also create the table by running the query below in the SQL query pane:


USE WAREHOUSE SNOWFLAKE_WH;
USE DATABASE Retail_DB_Sales;
USE SCHEMA Sales_Data;

CREATE TABLE "RETAIL_DB_SALES"."SALES_DATA"."SUPERSTORE_SALES"
("Category" STRING,
"Customer Name" STRING,
"Order ID" STRING NOT NULL,
"Postal Code" INTEGER,
"Product Name" STRING NOT NULL,
"Quantity" INTEGER NOT NULL,
"Sales" INTEGER NOT NULL,
"Segment" STRING,
"Sub-Category" STRING)
COMMENT = 'Superstore Sales per Category';
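
After running the statement, you can verify that the table exists and is still empty:

DESCRIBE TABLE "RETAIL_DB_SALES"."SALES_DATA"."SUPERSTORE_SALES";

-- Should return 0 before any data is loaded
SELECT COUNT(*) FROM "RETAIL_DB_SALES"."SALES_DATA"."SUPERSTORE_SALES";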

Step 2: Import CSV from Local Drive

(Screenshot: loading a local file into Snowflake)

  1. From the breadcrumb trail, make sure you are still in the SUPERSTORE_SALES (SALES_DATA) table.
  2. From the table tab, click Tables.
  3. Click Load Table. The pop-up shows four steps: Warehouse, Source Files, File Format, and Load Options.
  4. Warehouse (to carry out the work): select SNOWFLAKE_WH -> Next.
  5. Source Files (to identify the file we want to load): select Load Files From Your Computer -> Next.
  6. File Format (to provide details of our file format): click the + sign.
  7. Now go back to the Tables view and make sure the breadcrumb trail shows Databases > RETAIL_DB_SALES > SUPERSTORE_SALES (SALES_DATA). If not, select the SUPERSTORE_SALES table from the table list, then choose Load Table again.
  8. Once the data is loaded, you can navigate to the RETAIL_DB_SALES database in the navigation pane and see that the SUPERSTORE_SALES table is there. You can also run a simple SELECT * statement or click Preview Data in the preview pane to view the data you've loaded.
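
As a quick sanity check after loading, the simple SELECT * from step 8 can be extended into a small aggregation, for example:

-- Total sales and quantity per category, largest first
SELECT "Category", SUM("Sales") AS "Total Sales", SUM("Quantity") AS "Total Quantity"
FROM "RETAIL_DB_SALES"."SALES_DATA"."SUPERSTORE_SALES"
GROUP BY "Category"
ORDER BY "Total Sales" DESC;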


In the Create File Format dialog box, provide the information below:

  • Name: Sales_Data_Comma
  • Column separator: Comma
  • Header lines to skip: 1
  • You can change the other options as you like, but for now we will leave the rest at their defaults.
  • Click Finish.

(Screenshot: checking the file format in Snowflake)

You can also replace the whole of step 6 with the code below:

USE WAREHOUSE SNOWFLAKE_WH;
USE DATABASE Retail_DB_Sales;
USE SCHEMA Sales_Data;

CREATE FILE FORMAT SALES_DATA_COMMA
TYPE = 'CSV'
COMPRESSION = 'AUTO'
FIELD_DELIMITER = ','
RECORD_DELIMITER = '\n'
SKIP_HEADER = 1
FIELD_OPTIONALLY_ENCLOSED_BY = 'NONE'
TRIM_SPACE = FALSE
ERROR_ON_COLUMN_COUNT_MISMATCH = TRUE
ESCAPE = 'NONE'
ESCAPE_UNENCLOSED_FIELD = '\134'
DATE_FORMAT = 'AUTO'
TIMESTAMP_FORMAT = 'AUTO'
NULL_IF = ('\\N');
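
You can then confirm that the file format was created correctly:

-- List file formats in the schema and inspect the new one
SHOW FILE FORMATS IN SCHEMA RETAIL_DB_SALES.SALES_DATA;
DESCRIBE FILE FORMAT SALES_DATA_COMMA;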

(Screenshot: overview of the Superstore data in Snowflake)

The Load Data dialog will pop up as below. Follow these steps:

  1. Warehouse: choose SNOWFLAKE_WH.
  2. Source Files: choose Load Files From Your Computer, then select the Superstore_Sales.csv you downloaded at the beginning of this tutorial.
  3. File Format: choose SALES_DATA_COMMA from the drop-down.
  4. Load Options: select the second option (Stop loading, rollback, and return the error).
  5. Click Load.
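
Behind the scenes, the wizard stages your file and runs a COPY INTO command. If you prefer the command line, roughly the same load can be done from SnowSQL (the local file path below is a placeholder for wherever you saved the CSV):

USE SCHEMA RETAIL_DB_SALES.SALES_DATA;

-- Upload the local file to the table stage (PUT works from SnowSQL/drivers, not the web UI)
PUT file:///path/to/Superstore_Sales.csv @%SUPERSTORE_SALES;

-- Load it using the file format created earlier;
-- ON_ERROR = ABORT_STATEMENT matches "Stop loading, rollback, and return the error"
COPY INTO SUPERSTORE_SALES
  FROM @%SUPERSTORE_SALES
  FILE_FORMAT = (FORMAT_NAME = SALES_DATA_COMMA)
  ON_ERROR = ABORT_STATEMENT;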
(Screenshot: loading data within Snowflake)

This is the pop-up you will see once the data is successfully loaded.

(Screenshot: Snowflake load results)

If your file does not load and you can't pinpoint the issue, try one of the other Load Options from step 4. The “Continue loading...” option lets you see which rows loaded successfully and which did not.
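
If you load via SQL instead of the wizard, the Load Options map onto the ON_ERROR parameter of COPY INTO, and you can even dry-run the load first (this assumes the file is already staged, e.g. in the table stage @%SUPERSTORE_SALES):

-- Dry run: report problem rows without loading anything
COPY INTO SUPERSTORE_SALES
  FROM @%SUPERSTORE_SALES
  FILE_FORMAT = (FORMAT_NAME = SALES_DATA_COMMA)
  VALIDATION_MODE = RETURN_ERRORS;

-- Skip bad rows and load the rest (the "Continue loading..." option)
COPY INTO SUPERSTORE_SALES
  FROM @%SUPERSTORE_SALES
  FILE_FORMAT = (FORMAT_NAME = SALES_DATA_COMMA)
  ON_ERROR = CONTINUE;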

(Screenshot: final result in Snowflake)

Congrats! You've made it this far, which means you are ready to tackle the next challenge: importing a CSV from cloud storage.

Want more? Learn how to load structured data from AWS S3 into Snowflake here.
