Product spotlight: Four new dbt Cloud features you need to know about
Aug 23, 2024
We have been shipping a ton of new features in dbt Cloud. So, we decided to kick off a new Product Spotlight Series to show how they can improve your analytics workflows. We'll highlight column-level lineage and lineage lenses, our new chatbot Ask dbt, unit tests, and job chaining for easier pipeline orchestration... complete with demo videos, which you'll find below 👇
Column-level lineage and lineage lenses
First up are some cool capabilities in dbt Explorer that help you navigate, understand, and troubleshoot your entire data estate more seamlessly. Think of dbt Explorer as the data catalog for all your dbt assets. You can instantly visualize your project's DAG and understand resource dependencies. Impact analysis and pipeline troubleshooting are no longer stressful, mind-numbing exercises. With dbt Explorer and column-level lineage, you can visualize relationships between sources and models and where they're used downstream in other projects, metrics, and dashboards.
Using lineage lenses, it's easy to dive deeper to grok critical details like model execution status, materialization type, test status, and query count metrics. Let's say you get a request from a stakeholder to add a new parameter to their data pull. You no longer have to worry that you'll break something downstream. With the column evolution lens, you can track exactly how columns flow, transform, and are renamed across your pipeline. dbt Explorer and these new capabilities help you build, troubleshoot, and analyze workflows more efficiently. They also keep your stakeholders happy since they can quickly get the trusted data they need.
Ask dbt
You didn't think we'd have a product spotlight series without talking about AI, did you? ICYMI, dbt Cloud now offers a native in-app chatbot called Ask dbt as part of our Snowflake native app. Democratize analytics by allowing your stakeholders to ask questions like “What is ARR growth over time?” or “What is the count of customers by plan type?” They'll get (governed, consistent, accurate) answers without writing a single line of SQL.
What makes it different from other chatbots? Context. Ask dbt is powered by the metrics defined in the dbt Semantic Layer, improving accuracy by 3x in our benchmarks. With Ask dbt, users can ask questions in natural language and receive insights in an understandable format. It can significantly speed up business processes and decision-making. The best news? The dbt Semantic Layer also powers your insights for other analytics endpoints. Whether it's a BI interface like Tableau or Google Sheets or a personalized embedded analytics experience, you can be confident that metrics are consistent across your varied user experiences.
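For a sense of what sits underneath those answers, here's a hedged sketch of the kind of Semantic Layer YAML that could back the two example questions above. The fct_subscriptions model and its columns are hypothetical stand-ins; your semantic models and metrics will reflect your own schema.

```yaml
# models/marts/subscriptions.yml (sketch) -- model and column names are hypothetical
semantic_models:
  - name: subscriptions
    model: ref('fct_subscriptions')
    defaults:
      agg_time_dimension: started_at
    entities:
      - name: subscription_id
        type: primary
      - name: customer_id
        type: foreign
    dimensions:
      - name: started_at
        type: time
        type_params:
          time_granularity: day
      - name: plan_type
        type: categorical
    measures:
      - name: arr_amount            # summed to answer "What is ARR growth over time?"
        agg: sum
        expr: annual_recurring_revenue
      - name: customer_count        # backs "count of customers by plan type"
        agg: count_distinct
        expr: customer_id

metrics:
  - name: arr
    label: "ARR"
    type: simple
    type_params:
      measure: arr_amount
  - name: count_of_customers
    label: "Count of customers"
    type: simple
    type_params:
      measure: customer_count
```

Because Ask dbt and your other analytics endpoints resolve questions against these same definitions, "ARR" means the same thing no matter where it's queried.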
Unit testing
You asked, we listened. Unit testing is now live in dbt. And by "we" I really do mean the "royal we". Thank you to everyone in the dbt community who helped make unit testing a reality. 🧡 While dbt users have long been able to build assertions (or tests) about their dbt models (for example: is not null, is unique, etc.), unit tests allow you to validate the behavior of model logic before the model is materialized with real data. Not just "Will this model build?" but "Will it build what I expect it to build?" If a unit test fails, the model won't build. This saves you from unnecessary data platform spend while improving data product reliability and mitigating the risk of introducing breaking changes into your pipeline.
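To make that concrete, here's a minimal sketch of how a unit test can sit alongside classic schema tests in a project's YAML (dbt v1.8+). The dim_customers model, its columns, and the stg_customers input are hypothetical stand-ins.

```yaml
# models/schema.yml (sketch) -- model, column, and input names are hypothetical
models:
  - name: dim_customers
    columns:
      - name: customer_id
        data_tests:          # classic assertions, run against materialized data
          - not_null
          - unique

unit_tests:
  - name: test_valid_email_flag
    description: "Check the email-validation logic before the model is built."
    model: dim_customers
    given:
      - input: ref('stg_customers')   # mocked input rows, no real data needed
        rows:
          - {customer_id: 1, email: "sam@example.com"}
          - {customer_id: 2, email: "not-an-email"}
    expect:
      rows:
        - {customer_id: 1, is_valid_email: true}
        - {customer_id: 2, is_valid_email: false}
```

During dbt build, unit tests run before the model materializes, so a failing expectation stops the build exactly as described above.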
Lean on unit tests in dbt to:
- Save costs: Validate logic before transforming a full production dataset.
- Improve code reliability: Reduce risk of breaking changes in production.
- Collaborate at scale: Create stable and reliable interfaces for cross-team collaboration.
Job chaining
Automation is the name of the game, and dbt Cloud just got an upgrade when it comes to orchestrating your jobs. Sure, cron jobs and manual configs work, but as your dbt projects and dependencies grow, that approach becomes unwieldy and untenable. With job chaining, you can trigger jobs to run automatically as soon as an upstream job successfully completes, both within and across projects. All you have to do is configure the jobs you want to chain and flip a toggle.
Using job chaining, you can:
- Optimize compute spend: Only run a downstream model once an upstream job has successfully completed.
- Reduce undifferentiated heavy lifting: Don't manually trigger jobs; let the computers do it for you.
- Reclaim control: Build flexibility into how you orchestrate your jobs, so your stakeholders are always working from the freshest data.
That's a wrap
Thanks for joining us for our first-ever Product Spotlight Series! We hope you learned something new about how dbt Cloud's latest features can improve your analytics workflows and foster data collaboration within your organization. As always, we love your feedback. Please continue to send it along via your rep, our community Slack, or in-app. We hope to see you IRL soon at Coalesce 2024, which is happening October 7 - 10 in Las Vegas (or online). We'll geek out over features like these that are helping us transform data analytics together.