
AXS delivers business value with an analytics engineering workflow

This is the story of how the ticketing industry leader improved data velocity with dbt Cloud

50% faster to deploy new models, using incremental models
10% faster troubleshooting, using lineage graphs
40% of work hours saved on maintenance

AXS: selling over 65 million tickets a year

AXS is a global leader in ticketing, serving North America, Europe, Asia, Australia, and New Zealand. They are the ticketing platform of choice for major festivals (like Coachella), over 500 venues (including the O2 Arena), and popular sports teams (LA Galaxy, LA Kings, and more).

Providing data for diverse stakeholders

The AXS data team is responsible for compiling, transforming, and sharing data with their customers (venues and promoters), internal users (from marketing to business intelligence teams), and, operationally, end users (fans). As with any e-commerce platform, data plays a pivotal role in driving revenue across the business by enabling activities such as:

  • Pricing recommendations: Calculating optimal ticket prices to increase average revenue per ticket and events’ sell-through rate.
  • Conversion optimization: Measuring the customer journey to uncover opportunities to increase conversion rates.
  • Marketing: Reporting on sources and channels leading to purchases, and sharing data with marketing channels to improve campaign performance.
  • Predictive analytics: Forecasting the company’s growth trajectory and events’ performance (pre-sales metrics).
  • Fraud detection: Reducing fraud—a huge problem in the ticketing industry.
  • Customer experience: Improving the fan experience with relevant content, event recommendations, and more.

Identifying a gap: analytics engineering best practices

To turn raw data into actionable insights faster than ever before, the data team needed to build a new way of working grounded in analytics engineering best practices. The team identified the following areas of focus:

  • Observability: Improving visibility into workflows and errors, including clear data lineage, version control, governance, and alerting.
  • Data consistency: Keeping metrics consistent across data sets to prevent confusion and data quality issues.
  • Automated ETL process: Shifting data ingestion and transformation from a manual task to an automated, fit-for-purpose workflow less prone to errors and long hours of troubleshooting.

To achieve their analytics engineering vision, the team searched for new tools that would enable the workflow and technological shift.

Choosing dbt Cloud, starting with dbt Core

The must-have requirement that led AXS’ search was version control. To unify development practices across teams, the new stack had to match internal users’ (data engineers and analytics engineers) preferences: git integration and support for Python and SQL.

The AXS team started with a two-week proof of concept on dbt Core before choosing dbt Cloud for its lower complexity and ease of onboarding:

“dbt Cloud is great because it’s so approachable. You can log in with your browser, select a project, and start building models right away,” said Michael Colella, Senior Director of Data & Analytics at AXS.

Reaping the benefits of an analytics engineering workflow

Improved collaboration with a common workflow

Today, all data projects across AXS’ global regions are centralized on dbt Cloud. Teams can check on existing jobs, write new jobs, rerun models, and start new development all in the same place.

“dbt Cloud helped us remove complexities and gave us a framework in analytics engineering for organizing our work,” explained Michael. “By centralizing data work in one tool with one language (SQL), it created a commonality among our tech teams and facilitated communication. That was very powerful.”

Driving data quality with testing and documentation

The dbt workflow involves building staging models, using a governance layer, and testing before rolling out any production changes, which directly improves data quality. dbt Cloud’s built-in documentation, now updated whenever engineers make code changes, maintains traceability so future team members can uphold that data quality too.

“Three years after an engineer has left, you can see what they did and why they did it,” said Michael. “I can’t overstate how convenient it is to have the documentation right in dbt.”
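
For illustration only (these are not AXS’ models), tests and descriptions in dbt are typically declared in a YAML properties file alongside the model SQL; the names stg_ticket_sales, sale_id, and event_id below are hypothetical:

```yaml
# models/staging/stg_ticket_sales.yml -- hypothetical example, not AXS's project
version: 2

models:
  - name: stg_ticket_sales
    description: "One row per ticket sale, lightly cleaned from the raw source."
    columns:
      - name: sale_id
        description: "Primary key for a ticket sale."
        tests:
          - unique
          - not_null
      - name: event_id
        description: "Foreign key to the events staging model."
        tests:
          - not_null
```

Running dbt test checks these assertions before changes are promoted, and dbt docs generate publishes the descriptions as browsable documentation.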

Easier debugging with lineage and automated alerts

When issues do get past the data governance guardrails, dbt makes it easy to figure out what went wrong. AXS uses dbt Cloud’s lineage graphs to facilitate root-cause analysis:

“One of the great things with dbt is easier debugging. To debug faster, you need visibility into your pipelines and solution architecture,” explained Michael. “dbt’s lineage graphs tie your models together to visualize downstream impact and see how things are working overall.”
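
The lineage graph is derived from ref() calls in model SQL: each ref() records a dependency between models, which dbt uses to render upstream and downstream relationships. A minimal sketch, with hypothetical model names:

```sql
-- models/marts/fct_event_revenue.sql -- hypothetical example
-- each ref() below becomes an edge in the lineage graph, so a change to
-- either staging model shows this mart as a downstream impact
select
    e.event_id,
    e.event_name,
    sum(s.sale_amount) as total_revenue
from {{ ref('stg_events') }} as e
join {{ ref('stg_ticket_sales') }} as s
    on e.event_id = s.event_id
group by 1, 2
```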

The data team also takes advantage of automated job scheduling and alerting. Jobs are set up once, and if one fails, the team is automatically notified via Slack. This decreases the amount of time dedicated to maintenance and enables data practitioners to spot bugs much earlier.

50% faster deploy times and lower time-to-value

Since migrating to a modern data stack and implementing an analytics engineering workflow, AXS has achieved efficiency gains. With the help of features like incremental models—where only modified data is loaded—the team cut deployment time by 50%. This has led to faster processing times and decreased warehouse costs.
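
An incremental model in dbt processes only rows added or changed since the last run instead of rebuilding the whole table. A minimal sketch, with hypothetical model and column names:

```sql
-- models/marts/fct_ticket_sales.sql -- hypothetical example
{{ config(
    materialized='incremental',
    unique_key='sale_id'
) }}

select
    sale_id,
    event_id,
    sale_amount,
    updated_at
from {{ ref('stg_ticket_sales') }}

{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than what is already in the target table
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

On the first run dbt builds the full table; on subsequent runs the is_incremental() block filters the source, which is where the processing-time and warehouse-cost savings come from.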

Data velocity has improved; macros enable engineers to reuse pieces of code, reducing the net new code required. And since the team now spends less time on data munging, they can better focus on delivering value.
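
A dbt macro is a reusable Jinja snippet that compiles into SQL wherever it is called. A simple hypothetical example (not AXS’ code):

```sql
-- macros/cents_to_dollars.sql -- hypothetical macro
{% macro cents_to_dollars(column_name, precision=2) %}
    round({{ column_name }} / 100.0, {{ precision }})
{% endmacro %}
```

Any model can then call {{ cents_to_dollars('sale_amount_cents') }} rather than repeating the conversion logic in every query.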
