Production-Ready Toolkit

dbt + BigQuery
Performance Pack

Templates, not a course. Reduce wasted bytes scanned, ship faster, and prove ROI with production-ready playbooks, macros, and benchmarking tools. Copy, paste, ship.

30-day money-back guarantee · Instant download · dbt 1.6+ compatible
benchmark — example output
-- Before: full table scan
SELECT * FROM events
847 GB scanned → $5.29 per run
45.2s runtime

-- After: Performance Pack applied
SELECT user_id, event_type
FROM events
WHERE dt >= '2024-01-01'
12 GB scanned → $0.08 per run
3.1s runtime

✓ Bytes scanned: 847 GB → 12 GB
✓ Example benchmark (results vary)
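Pruning like the "after" query only happens if the table is partitioned on the filter column. A minimal dbt-bigquery model config that sets this up (column names here are illustrative, not from the pack):

```sql
-- models/events.sql — illustrative sketch; your column names will differ
{{ config(
    materialized='table',
    partition_by={'field': 'dt', 'data_type': 'date'},
    cluster_by=['user_id', 'event_type']
) }}

select user_id, event_type, dt
from {{ source('raw', 'events') }}
```

With `dt` as the partition column, a `WHERE dt >= ...` filter lets BigQuery skip whole partitions instead of scanning the full table.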
6 Playbooks · 18 dbt Macros · 30+ SQL Snippets · 68+ Files Ready
Quickstart

Install & validate in 10 minutes

Copy into your dbt repo, run one analysis, then benchmark before/after. No magic — just production patterns.

Option A — local package
Register in packages.yml
Copy dbt_package/ch_bq_pack/ from the download into your repo (or keep it next to your dbt project), then reference it as a local package.
packages.yml
packages:
  - local: ./dbt_package/ch_bq_pack
Then run: dbt deps
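After `dbt deps`, the pack's macros resolve under the `ch_bq_pack` namespace. The call below is a hypothetical illustration of the shape of that usage; the real signatures live in the macro files themselves:

```sql
-- HYPOTHETICAL usage — open dbt_package/ch_bq_pack/macros/ for the real signatures
select *
from {{ ref('my_model') }}
where {{ ch_bq_pack.partition_filter('dt', 30) }}  -- assumed args: column, days back
```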
Option B — copy macros
Drop macros into your project
If you prefer zero package config, copy dbt_package/ch_bq_pack/macros/ directly into your project macros/ folder.
shell
mkdir -p ./macros && cp -R dbt_package/ch_bq_pack/macros/. ./macros/
Works with any dbt + BigQuery repo structure.
Validate cost & hotspots
Run a ready-made analysis
Start with the included BigQuery INFORMATION_SCHEMA queries to find top offenders (bytes billed, runtime, slot-time).
example file
analyses/cost/cost-01-top-20-most-expensive-queries.sql
Tip: set var('bq_region') if your jobs metadata isn't in region-us.
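To get a feel for these analyses before opening the files, a generic top-offenders query over the jobs metadata looks roughly like this (a sketch, not the pack's exact file; adjust the region qualifier to match your jobs location):

```sql
-- Generic sketch: top 20 queries by bytes billed over the last 7 days
SELECT
  user_email,
  job_id,
  total_bytes_billed / POW(1024, 3) AS gb_billed,
  total_slot_ms / 1000 AS slot_seconds,
  LEFT(query, 120) AS query_preview
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
  AND state = 'DONE'
ORDER BY total_bytes_billed DESC
LIMIT 20;
```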
Benchmark before/after
Automate the comparison (optional)
Use the Python harness to run the same query multiple times and capture runtime + bytes billed. Great for stakeholder reports.
python
python scripts/benchmark/harness.py --sql path/to/query.sql --runs 3
Requires google-cloud-bigquery + ADC credentials.
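The harness itself ships in the pack; as a rough sketch of what such a loop does (assuming `google-cloud-bigquery` and Application Default Credentials — the real `harness.py` may differ in flags and output):

```python
import statistics

def summarize(timings_s, bytes_billed):
    """Reduce raw per-run measurements to a stakeholder-friendly summary."""
    return {
        "runs": len(timings_s),
        "median_s": round(statistics.median(timings_s), 2),
        "gb_billed": round(bytes_billed / 1e9, 2),
    }

def benchmark(sql, runs=3):
    """Run `sql` several times with the cache disabled; capture runtime and bytes billed."""
    from google.cloud import bigquery  # lazy import: calling this needs GCP credentials
    client = bigquery.Client()
    # Disable the result cache so every run is actually executed and billed.
    cfg = bigquery.QueryJobConfig(use_query_cache=False)
    timings, billed = [], 0
    for _ in range(runs):
        job = client.query(sql, job_config=cfg)
        job.result()  # block until the job finishes
        timings.append((job.ended - job.started).total_seconds())
        billed = job.total_bytes_billed
    return summarize(timings, billed)
```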
What's Inside

Everything you need to optimize BigQuery

Six deep-dive playbooks with production-ready code. Not theory — validate in your environment and ship.

PLAYBOOK 01
Query Cost Optimization
Diagnose expensive queries with INFORMATION_SCHEMA, fix partition pruning, add clustering, and set cost guards that prevent runaway bills.
PLAYBOOK 02
Incremental Models & Merge
6 incremental strategies with tradeoffs. Safe MERGE patterns that don't create duplicates. Late-arriving data handling that actually works.
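As a taste of the merge pattern, here is a bare-bones dbt-bigquery incremental model with a unique key; the pack's `safe_merge` macro adds guardrails beyond this sketch, and the lookback window below is a tunable assumption:

```sql
-- Minimal merge incremental — illustrative only
{{ config(
    materialized='incremental',
    incremental_strategy='merge',
    unique_key='event_id',
    partition_by={'field': 'dt', 'data_type': 'date'}
) }}

select event_id, user_id, event_type, dt
from {{ source('raw', 'events') }}
{% if is_incremental() %}
  -- Reprocess a few trailing days so late-arriving rows get merged, not duplicated
  where dt >= date_sub(current_date(), interval 3 day)
{% endif %}
```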
PLAYBOOK 03
Macros & Reusable Patterns
18 production macros: cost guard, safe merge, clone safety net, data quality assertions, Editions calculator, and more. Drop into any dbt project.
PLAYBOOK 04
Performance Tuning
BI Engine strategies, query caching exploitation, APPROX functions for 10x faster aggregations, and JOIN patterns most engineers miss.
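The APPROX trade-off in one line: swap an exact aggregate for a sketch-based estimate with a small, bounded error.

```sql
-- Exact (expensive on high-cardinality columns):
SELECT COUNT(DISTINCT user_id) FROM events;

-- Approximate (HyperLogLog++-based, far less memory and shuffle):
SELECT APPROX_COUNT_DISTINCT(user_id) FROM events;
```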
PLAYBOOK 05
Benchmark Harness
Native SQL benchmarking with INFORMATION_SCHEMA. Measure cost and runtime before/after any change. Generate stakeholder reports automatically.
PLAYBOOK 06
Editions Migration
Decision framework + autoscaling analysis to choose Standard vs Enterprise vs Enterprise Plus—and avoid bursty-workload cost traps.
BONUS
dbt Package (Installable)
Everything packaged as a dbt-compatible package. Add it via packages.yml or copy the macros folder directly into your project.
📋 18 Analysis Queries · one file per query, run instantly
📄 Cheat Sheet · 1-page reference for your desk
🚀 Quickstart Guide · running in 10 minutes
🐍 Python Harness · optional advanced automation
Peek Inside

What the download looks like

dbt-bq-pack/
📄 QUICKSTART.md
📄 CHEATSHEET.md
📄 README.md
📁 docs/playbooks/
01-query-cost-optimization/PLAYBOOK.md
02-incremental-models-merge/PLAYBOOK.md
03-macros-reusable-patterns/PLAYBOOK.md
04-performance-tuning/PLAYBOOK.md
05-benchmark-harness/PLAYBOOK.md
06-editions-migration/PLAYBOOK.md
📁 dbt_package/ch_bq_pack/macros/
performance/cost_guard.sql
performance/partition_filter.sql
incremental/safe_merge.sql
quality/assert_not_null_proportion.sql
…and 14 more macros
📁 analyses/
cost/cost-01-top-20-expensive-queries.sql
monitoring/monitor-01-pipeline-health.sql
…and 16 more queries
📁 scripts/benchmark/
harness.py
requirements.txt
📁 scripts/
editions_calculator.py
roi_calculator.py
…and 2 more scripts
📁 docs/snippets/
cost-optimization/cost_snippets.sql
…and 3 more snippet files
Explore the Code

See exactly what you're shipping

Real production code — the same files data engineers copy into their dbt projects on day one.

ROI

Estimate monthly spend in 60 seconds

This is a simple on-demand estimate using bytes scanned. Use it to sanity-check impact and get stakeholder buy-in. (Results vary.)

Assumptions: on-demand analysis pricing at $6.25 / TiB with first 1 TiB / month free. This is an estimate for planning.
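The calculator is just this arithmetic. A standalone sketch using the stated assumptions ($6.25/TiB on demand, first 1 TiB per month free, default 70% reduction — verify the rate against current Google Cloud pricing):

```python
def monthly_cost(tib_scanned, price_per_tib=6.25, free_tib=1.0):
    """On-demand analysis cost with the monthly free tier applied."""
    return max(tib_scanned - free_tib, 0.0) * price_per_tib

def estimate_savings(tib_scanned, reduction=0.70):
    """Return (before, after, savings) in dollars, rounded to cents."""
    before = monthly_cost(tib_scanned)
    after = monthly_cost(tib_scanned * (1.0 - reduction))
    return round(before, 2), round(after, 2), round(before - after, 2)

# e.g. a team scanning 50 TiB/month at a 70% reduction:
# estimate_savings(50) -> (306.25, 87.5, 218.75)
```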
Pricing

One purchase. Lifetime access.

No subscriptions. Pay once, use forever. Optional support add‑on available after purchase.

Solo License
$199
For 1 engineer
Unlimited personal & professional projects
All 6 playbooks
18 installable macros
30 SQL snippets
Benchmark harness
Cheat sheet + quickstart
Lifetime updates
Get Solo License
Company License
$999
Unlimited engineers
Organization-wide deployment
Everything in Team
Unlimited engineers
CI/CD integration rights
Internal tooling integration
Lifetime updates
Get Company License
🛡️ 30-day money-back guarantee. No questions asked.
FAQ

Common questions

Q: What exactly do I get?
A ZIP file with 68 production-ready files: 6 playbooks (deep-dive guides with code), 18 dbt macros, 18 analysis queries (one per file), a Python benchmark harness, a cheat sheet, and a quickstart guide. Everything is organized as a dbt-compatible package.

Q: Is this a course?
No. This is a code toolkit: templates, macros, and SQL queries you copy into your dbt project and run immediately. The playbooks provide context and decision frameworks, but the core value is ready-to-use code.

Q: Which dbt versions are supported?
dbt-core >= 1.6 with dbt-bigquery >= 1.6. The macros use standard Jinja2 — no experimental features. Includes microbatch examples for dbt 1.9+.

Q: Will this work for my setup?
If you use dbt + BigQuery with on-demand pricing, yes. The playbooks and macros are designed for standard BigQuery SQL. You'll need to replace placeholder project/dataset names with your own (documented in the Quickstart).

Q: What's the difference between the license tiers?
The content is identical. The license determines how many engineers can use it: Solo (1 person), Team (up to 5 in your org), Company (unlimited in your org with CI/CD integration rights).

Q: What if it's not for me?
30-day money-back guarantee, no questions asked. Email support@chdatatools.com and you'll get a full refund.

Stop overpaying for BigQuery.

Join data engineers who ship faster and spend less. One purchase, lifetime access.