
How to Pass the dbt Analytics Engineering Certification

A structured 4-week study plan to pass the dbt Analytics Engineering Certification (exam version 1.7), with prerequisites, phases, and exam topics.
By Arend Verschueren, Head of Marketing & RevOps

Update for 2026: This post was originally written when the dbt certification exam was on version 1.1, shortly after its 2022 launch. Since May 2024, the exam has been on version 1.7, and the official dbt Labs study guide is now on v9.0. Metrics, exposures, and Python models, which were not in scope when our consultant sat the exam, are now part of the curriculum. The study plan below has been updated for the current exam.

So you want to get dbt certified by acing the dbt Analytics Engineering Certification Exam. Good call. This is our team's experience preparing for and passing it, written up to give you a head start.

The short version: with the right prerequisites (SQL fluency and ideally 6+ months of hands-on dbt) you can pass it in roughly 3 to 4 weeks of focused study, working through 6 structured phases that take you from the dbt Fundamentals course all the way to memorising the YAML reference pages.

What is the dbt Analytics Engineering Certification?

The dbt Analytics Engineering Certification is the official credential from dbt Labs that validates your ability to build, test, document, and maintain data transformation pipelines using dbt. The current exam (version 1.7) covers everything from Jinja and macros to incremental models, generic and singular tests, source freshness, environments, jobs, and CI/CD with dbt.
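Incremental models in particular are worth being hands-on with rather than just reading about. A minimal sketch of one (model and column names are illustrative):

```sql
-- models/fct_orders.sql
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ ref('stg_orders') }}

{% if is_incremental() %}
-- on incremental runs, only process rows newer than what is already in the table
where order_date > (select max(order_date) from {{ this }})
{% endif %}
```

The exam expects you to know what `is_incremental()` evaluates to on a first run versus a subsequent run, and what `--full-refresh` does to this model.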

What makes this certification distinctive is that it does not certify dbt the tool in isolation. It certifies the broader role of the Analytics Engineer. The exam asks questions about SQL best practices, git workflows, and data modelling patterns alongside the dbt-specific material, because those skills are what the analytics engineer actually uses on the job.

Our preferred analogy: spending years cutting down trees with an axe (executing .sql files manually from a workbench) makes you pretty good at it. Then someone hands you a chainsaw (dbt). It takes a few weeks to get the hang of it, but you immediately understand how this thing is going to change your day-to-day work for the better. The certification, in a sense, is the official chainsaw safety course.

For context on what dbt actually does, our overview of dbt as a data engineering solution covers the fundamentals. The rest of this post assumes you have decided you want to certify and need a plan.

Prerequisites: where you should be before you start

The official dbt Labs recommendation is SQL proficiency plus at least 6 months of dbt experience (Core or Cloud). Our experience suggests adding Python and git to that list. Here is a realistic starting baseline based on what's worked for the Biztory consultants who have certified:

  • SQL: fluent, with roughly 3 years of intensive daily use across multiple dialects (MySQL, Oracle, Redshift, etc.).
  • Python: about 12 months of use as an ETL scripting tool. Junior-level, but enough to recognise the basic concepts.
  • dbt: zero prior experience is workable, though more helps.
  • git: a 2-day introduction course is the minimum.

More importantly, prior experience in the role dbt describes as the Analytics Engineer makes a real difference. Because the exam is built around that role, the problems dbt "solves" feel native to people who have done analytics engineering work rather than abstract.

If your dbt experience is patchy, consider working through a real project before you start studying. Biztory's dbt training or a dbt QuickStart engagement will both close that gap faster than studying alone.

The 6-phase study plan (3 to 4 weeks)

The biggest challenge with this exam is its black-box nature. No practice exams are available, and the "official" documentation from dbt is sprawling (docs, references, guides, blogs, tutorials). The phases below are the order our team has found most effective, and they map cleanly to the dbt Labs study guide v9.0.

Phase 1: Take the dbt Fundamentals course (~1 week)

Start with the dbt Fundamentals course. It is free.

Take your time with this one. dbt says the course is "about 5 hours," but that covers video length only. Actually doing the exercises and getting comfortable with the concepts takes most of a week if you are new.

The recommended approach: follow along with the official jaffle-shop dataset tutorial in dbt Cloud + Snowflake, then build your own project in dbt Core with a random dataset from the web uploaded to Snowflake. That second step matters: building something yourself, from scratch, is what makes the concepts stick.
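If you go the dbt Core route for that second project, the command loop you will cycle through looks like this (the project name is illustrative):

```shell
# scaffold a new dbt project
dbt init my_practice_project

# verify the warehouse connection defined in profiles.yml
dbt debug

# build the models, then run the tests defined in your YAML files
dbt run
dbt test

# generate and browse the documentation site locally
dbt docs generate
dbt docs serve
```

Getting these commands into muscle memory now pays off in Phase 6, when you need to recall their flags and behaviour without looking anything up.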

Phase 2: Follow the official Learning Path (~1 to 2 weeks)

The dbt Labs study guide includes a full Learning Path that takes you through every relevant course. This phase takes the longest. By this point, start building your summary document (more on that below) — every course you complete, every doc you read, every nuance you pick up, write it down in your own words.

The Learning Path covers:

  • dbt Fundamentals (you already did this)
  • Jinja, Macros, Packages
  • Advanced Materialisations
  • Analyses and Seeds
  • Refactoring SQL for Modularity
  • Advanced Deployment with dbt Cloud
  • Advanced Testing
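The Jinja, Macros, Packages course pays off disproportionately on the exam. The core idea is wrapping a repeated SQL expression in a macro, along these lines (names follow the style of the course examples):

```sql
-- macros/cents_to_dollars.sql
{% macro cents_to_dollars(column_name, decimals=2) %}
    round({{ column_name }} / 100, {{ decimals }})
{% endmacro %}
```

Which you would then call from a model as `select {{ cents_to_dollars('amount') }} as amount_usd from {{ ref('stg_payments') }}`.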

Phase 3: Read through the dbt documentation

Specifically the docs and reference sections. The goal is deeper, more nuanced information on the topics covered in the courses. This is also a great recap exercise to consolidate what you learned in Phase 2.

Pay particular attention to:

  • Materialisations (table, view, incremental, ephemeral, materialized view)
  • Jinja and macros
  • Sources, tests, snapshots, seeds
  • Configurations (project, model, source)
  • Hooks (the four types and when each fires)
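Hooks trip people up because there are four of them with different scopes: two fire once per run, two fire once per model. A hedged sketch of where each is configured (project name, SQL statements, and role names are illustrative):

```yaml
# dbt_project.yml
on-run-start: "create schema if not exists audit"           # once, at the start of the run
on-run-end: "grant usage on schema audit to role reporter"  # once, at the end of the run

models:
  my_project:
    +pre-hook: "alter session set query_tag = 'dbt'"               # before each model builds
    +post-hook: "grant select on {{ this }} to role reporter"      # after each model builds
```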

Phase 4: Read selected guides

The guides are where the practical wisdom lives. The ones our consultants found most useful:

  • How we structure our dbt projects (staging → intermediate → marts)
  • Debugging errors in dbt (the error types and how to fix them)
  • Debugging schema names (custom schemas, concatenation behaviour)
  • Migrating from DDL, DML and stored procedures
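The custom schema behaviour covered in the debugging-schema-names guide is a frequent stumbling block: by default, dbt concatenates the target schema with a model's custom schema rather than replacing it. dbt's built-in macro behaves roughly like this (simplified):

```sql
-- dbt's default generate_schema_name macro (simplified)
{% macro generate_schema_name(custom_schema_name, node) -%}
    {%- set default_schema = target.schema -%}
    {%- if custom_schema_name is none -%}
        {{ default_schema }}
    {%- else -%}
        {{ default_schema }}_{{ custom_schema_name | trim }}
    {%- endif -%}
{%- endmacro %}
```

So a model with `+schema: marts` in a target schema called `dbt_arend` lands in `dbt_arend_marts`, not `marts`. Expect the exam to probe exactly this.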

Phase 5: Read selected blog posts

A few dbt Labs blog posts come up regularly in exam-prep discussions:

  • Your essential dbt project checklist
  • How to review an analytics pull request
  • The exact GitHub pull request template we use at dbt Labs

Phase 6: Study the reference pages (memorise the YAML)

The exam does not let you look anything up. You should be able to write a source.yml file from memory — not every property, but the most relevant ones from the courses.
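As a concrete target, you should be able to reproduce something like this from memory (source, table, and column names are illustrative):

```yaml
# models/staging/jaffle_shop/sources.yml
version: 2

sources:
  - name: jaffle_shop
    database: raw
    schema: jaffle_shop
    loaded_at_field: _etl_loaded_at
    freshness:
      warn_after: {count: 12, period: hour}
      error_after: {count: 24, period: hour}
    tables:
      - name: orders
        columns:
          - name: id
            tests:
              - unique
              - not_null
```

Knowing which keys nest where (freshness at the source level versus the table level, tests under a column) is exactly the kind of detail the exam checks.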

Key reference pages to know cold:

  • dbt_project.yml structure
  • Configs and properties (and where they can be defined)
  • dbt Jinja functions (ref, source, var, env_var, target, this)
  • Source properties and configurations
  • profiles.yml structure
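For the first of those, a stripped-down dbt_project.yml showing the structure the exam expects you to recognise (project name and folder layout are illustrative):

```yaml
name: my_project
version: '1.0.0'
config-version: 2
profile: my_profile

model-paths: ["models"]
seed-paths: ["seeds"]
macro-paths: ["macros"]

models:
  my_project:
    staging:
      +materialized: view
    marts:
      +materialized: table
```

Note how configs under `models:` cascade down the folder hierarchy, and how the `+` prefix marks a config rather than a subfolder.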

For a primer on YAML structure specifically, our post on configuring YAML files for dbt covers the patterns you need to know.

Build your summary document

By the start of week four you should have a substantial summary document, categorised under the 8 topics that the dbt study guide says the exam focuses on. This document is the single most valuable artefact of the entire study process.

Two things make it work:

  1. Write everything in your own words. Copy-pasting doesn't stick. Restating a concept in your own language forces you to actually understand it.
  2. Organise by exam topic, not by source. The exam tests across topics, not across courses. If your notes are organised "everything from course 3," you will not be able to find anything when you need to revise materialisations across all sources.

By this point in the timeline, plan for full study-like-you-are-back-at-university mode. Goodbye Netflix evenings, hello flashcards.

Topics to review before the exam

Since dbt does not allow sharing specific exam content, what follows is a generic list of topics our team remembers being asked about. Use this as a final-days revision aid, not as a guaranteed exam map.

Disclaimer: The list below is based on the personal experience of a Biztory legend, Michiel, who sat the exam.

Michiel has since left Biztory, and the exam has been updated multiple times since he wrote this post (the current version is 1.7). Use this as supplementary guidance, not as your single source of truth. The official dbt Labs study guide (currently v9.0) is the canonical reference.

[Image: topics to review before taking the dbt Analytics Engineering exam]

Common configurations cheat sheet

These configurations come up often enough to be worth memorising:

[Image: dbt Analytics Engineering exam configurations cheat sheet]
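As a flavour of what that means in practice, here is a hedged sketch of an in-model config block combining several commonly tested configurations (values are illustrative):

```sql
-- at the top of a model file
{{ config(
    materialized='table',      -- table, view, incremental, or ephemeral
    schema='marts',            -- appended to the target schema by default
    tags=['daily'],            -- lets you select the model with dbt run --select tag:daily
    persist_docs={'relation': true, 'columns': true}
) }}
```

For each of these, know the equivalent dbt_project.yml syntax and which level (project, folder, model) wins when they conflict: the most specific config takes precedence.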

Frequently Asked Questions

How long does it take to study for the dbt Analytics Engineering Certification?

Around 3 to 4 weeks of focused study if you already have SQL fluency and some dbt exposure. If you are new to dbt entirely, plan for 6 to 8 weeks, including time to build a real dbt project end to end. The official dbt Labs recommendation is at least 6 months of dbt experience (Core or Cloud) before sitting the exam, but you can compress that with structured project work alongside the study plan.

What is the passing score for the dbt Analytics Engineering Certification?

The passing score is 65%. The exam is 65 questions and you have 2 hours to complete it. Questions are a mix of multiple-choice (one correct answer) and multi-select (choose all that apply), and the multi-select questions are generally where people lose the most marks. Check the official exam page for the most current details, as dbt Labs updates the exam format periodically.

Is the dbt Analytics Engineering Certification worth it?

For analysts and data engineers moving into analytics engineering, yes. The certification is a strong CV signal for that specific role, and the study process exposes you to areas of dbt you probably haven't used in production. For experienced dbt practitioners with a strong shipping record, the credential is a nice-to-have rather than a career mover, but most candidates find the structured study process valuable on its own.

Conclusion

The dbt Analytics Engineering Certification rewards structured preparation more than raw dbt experience. The candidates who pass it most cleanly are not necessarily the ones with the most years on the tool — they are the ones who treat the study process as its own project. Three to four weeks of focused work, a summary document organised by exam topic, and discipline around the reference pages will get you through.

If you want help building a stronger dbt foundation before you sit the exam, our dbt training and dbt QuickStart engagements are both designed to get you from zero to confident faster than self-study alone. Or if you want to talk through your own learning path, get in touch with the Biztory team.

Good luck with the exam.
