
Technology expertise

Snowflake & DBT — modern data engineering, CI/CD and cost optimisation

Decinova builds and maintains your data pipelines on Snowflake with DBT: tested and documented models, CI/CD via GitHub Actions, cost optimisation, and performance tuning.

Stack: Snowflake · DBT Core / Cloud · SQL · Jinja · GitHub Actions · Apache Iceberg · Cortex AI

Key figures: 340 DBT models · 2.1 TB storage · -42% costs · 98% tests passing

Our Snowflake expertise & DBT

Cloud-native data engineering, with no compromise on rigour

Snowflake and DBT are today's reference tandem for modern data engineering. Snowflake provides elastic compute and virtually unlimited storage. DBT brings software engineering practices to SQL transformations: version control, testing, documentation, CI/CD.

But this power comes with complexity. A poorly sized warehouse can blow up your bill. Untested DBT models can silently produce incorrect data. Without governance, your lakehouse turns into a data swamp.

Decinova covers the full stack: design and maintenance of DBT models, Snowflake cost optimisation (warehouse sizing, query tuning, credit monitoring), CI/CD pipelines, and migration from legacy platforms.

CI/CD: GitHub Actions · dbt build --select state:modified+ · Slim CI
Orchestration: dbt Cloud / Airflow / Dagster · Scheduling · Alerting
Transformation: DBT Core · SQL + Jinja · Tests · Documentation · Macros
Warehouse: Snowflake · Warehouses · Dynamic Tables · Iceberg · Cortex
Visualisation: Power BI · Tableau · Streamlit · Semantic Views
Managed services (TMA / MCO)

DBT models

SQL/Jinja model maintenance, test creation, documentation, refactoring. Dev/Staging/Prod environment management.

SQL · Jinja · Tests
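Testing and documentation live alongside the models in DBT's YAML schema files. The sketch below is illustrative — the model and column names are invented for the example, not taken from a real project:

```yaml
# models/marts/schema.yml — illustrative example; model and column
# names are hypothetical.
version: 2

models:
  - name: dim_clients
    description: "One row per client, deduplicated from the CRM source."
    columns:
      - name: client_id
        description: "Surrogate key."
        tests:
          - unique
          - not_null
      - name: country_code
        tests:
          - accepted_values:
              values: ['FR', 'BE', 'CH', 'LU']
```

`dbt build` then runs these tests on every execution, and `dbt docs generate` turns the descriptions into a browsable documentation site.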
Managed services (TMA / MCO)

Snowflake Admin

Warehouse management, roles and grants, cost monitoring, network and security configuration.

RBAC · Warehouses · Monitoring
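A typical RBAC setup separates the role DBT runs under from the read-only roles used by BI tools. A minimal sketch, in which the role, database, and warehouse names are examples:

```sql
-- Role used by dbt for transformations (names are illustrative)
CREATE ROLE IF NOT EXISTS TRANSFORMER;
GRANT USAGE ON DATABASE ANALYTICS TO ROLE TRANSFORMER;
GRANT USAGE ON SCHEMA ANALYTICS.STAGING TO ROLE TRANSFORMER;
GRANT CREATE TABLE, CREATE VIEW ON SCHEMA ANALYTICS.STAGING TO ROLE TRANSFORMER;
GRANT USAGE ON WAREHOUSE TRANSFORM_WH TO ROLE TRANSFORMER;

-- Read-only role for BI tools; FUTURE grants cover tables created later
CREATE ROLE IF NOT EXISTS REPORTER;
GRANT USAGE ON DATABASE ANALYTICS TO ROLE REPORTER;
GRANT SELECT ON FUTURE TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE REPORTER;
```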

CI/CD DBT

Tested transformation pipelines, versioned and automatically deployed

DBT's strength is that it treats SQL as code. And like code, it must be versioned in Git, tested at every change, and deployed through a controlled pipeline. This is where most organisations struggle.

Decinova implements a complete CI/CD workflow with GitHub Actions: every pull request automatically triggers a dbt build on a staging environment, runs tests, validates documentation, and deploys to production only after approval.

The result: data transformations as reliable and traceable as a software deployment. Every change is documented, every test is logged, every rollback is possible.

Slim CI: we only test what changed. With state:modified+, DBT identifies modified models and their descendants, reducing CI execution time by 80% on large projects.

github-actions — dbt ci/cd
// PR #247 — feat: add dim_clients_v2

trigger: pull_request → main

// Step 1 — Slim CI build
$ dbt build --select state:modified+
dim_clients_v2 — OK
fct_orders — OK (downstream)
12 tests passed — 0 failures

// Step 2 — Documentation
$ dbt docs generate
coverage: 94% documented

// Step 3 — Merge → deploy prod
$ dbt build --target prod
✓ deployed — 2 models updated
warehouse: TRANSFORM_WH_MEDIUM
duration: 47s · 0.8 credits
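The workflow above can be sketched as a GitHub Actions job. This is a minimal outline, not a production pipeline: the secret name, the location of the production `manifest.json` (which `state:modified+` needs to diff against), and the target names are all assumptions to be adapted per project:

```yaml
# .github/workflows/dbt-ci.yml — minimal Slim CI sketch; the S3 path,
# secret names, and targets are illustrative.
name: dbt-ci
on:
  pull_request:
    branches: [main]

jobs:
  slim-ci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake
      # Fetch the last production manifest so state:modified+ can diff against it
      - run: aws s3 cp s3://dbt-artifacts/manifest.json state/manifest.json
      # Build only modified models and their descendants, deferring
      # unchanged upstream refs to production
      - run: dbt build --select state:modified+ --defer --state state --target ci
        env:
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```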

Snowflake Optimization

Control your Snowflake costs without sacrificing performance

Snowflake bills by credit consumed — and credits add up fast. An XL warehouse running continuously when a Small would suffice, poorly optimised queries scanning entire tables, unused materialized views: costs spiral quickly.

Companies typically waste between 30% and 70% of their Snowflake credits. Not through negligence, but because cost optimisation requires specific expertise that most data teams don't have.

Decinova audits your consumption, identifies waste sources, and implements optimisations. We then monitor cost trends and alert before overruns.

Diagnostic

Consumption audit

Analysis of query history, warehouse usage, and credit consumption by team. Identification of the top 10 waste sources.

Query Profile · Credits
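This kind of audit starts from Snowflake's ACCOUNT_USAGE views. For instance, credit consumption per warehouse over the last 30 days (note that ACCOUNT_USAGE data lags real time by up to a few hours and requires a role with access to the SNOWFLAKE database):

```sql
-- Credits consumed per warehouse, last 30 days
SELECT
    warehouse_name,
    SUM(credits_used)                       AS credits_30d,
    COUNT(DISTINCT start_time::DATE)        AS active_days
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
WHERE start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
GROUP BY warehouse_name
ORDER BY credits_30d DESC;
```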
Optimisation

Warehouse Sizing

Right-sizing warehouses by workload. Optimal auto-suspend/auto-resume configuration. Multi-cluster for peak loads.

Sizing · Auto-suspend
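Right-sizing and auto-suspend are plain ALTER WAREHOUSE settings. A sketch, with illustrative warehouse names and a 60-second suspend matching the recommendation above:

```sql
-- Right-size and stop paying for idle time (names are illustrative)
ALTER WAREHOUSE TRANSFORM_WH SET
    WAREHOUSE_SIZE = 'SMALL'
    AUTO_SUSPEND   = 60        -- seconds of inactivity before suspending
    AUTO_RESUME    = TRUE;

-- Multi-cluster for peak loads: scale out rather than up
ALTER WAREHOUSE BI_WH SET
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 3
    SCALING_POLICY    = 'STANDARD';
```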
Performance

Clustering & Pruning

Clustering keys on the most filtered columns. Micro-partition pruning verification. Dynamic tables for incremental loads.

Clustering · Dynamic Tables
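In practice this means declaring a clustering key, checking that pruning actually improves, and using dynamic tables for incremental refreshes. The table and column names below are examples:

```sql
-- Cluster on the columns most often used in WHERE filters
ALTER TABLE fct_orders CLUSTER BY (order_date, country_code);

-- Inspect how well micro-partitions prune on that key
SELECT SYSTEM$CLUSTERING_INFORMATION('fct_orders', '(order_date)');

-- Dynamic table: declarative incremental refresh with a freshness target
CREATE OR REPLACE DYNAMIC TABLE daily_revenue
  TARGET_LAG = '15 minutes'
  WAREHOUSE  = TRANSFORM_WH
AS
SELECT order_date, SUM(amount) AS revenue
FROM fct_orders
GROUP BY order_date;
```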
Monitoring

Resource Monitors

Consumption alerts, budgets by warehouse and team, credit tracking dashboards.

Alerts · Budgets
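A resource monitor enforces such a budget natively. A sketch with an illustrative monthly quota: warnings at 75% and 90%, then a hard suspend at 100%:

```sql
-- Monthly credit budget with warnings, then a hard stop (quota is illustrative)
CREATE OR REPLACE RESOURCE MONITOR transform_budget
  WITH CREDIT_QUOTA = 500
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS
    ON 75 PERCENT DO NOTIFY
    ON 90 PERCENT DO NOTIFY
    ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE TRANSFORM_WH SET RESOURCE_MONITOR = transform_budget;
```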
30-70% · Credits wasted on average
60s · Recommended auto-suspend
10x · Pruning gain with clustering
24/7 · Continuous monitoring

Migration to Snowflake & DBT

From SQL Server, SAP BO, or SSIS — with our proprietary tools

Migrating to Snowflake and DBT means transforming your storage infrastructure, your transformation layer, and your operational model simultaneously. Without the right tools, this is a multi-month project.

Decinova accelerates this transition with three proprietary tools. Our SAP BO converter automatically translates universe SQL syntax for Snowflake. Our SSIS converter transforms DTSX packages into DBT models. And our audit scanner maps the entire legacy estate in 48 hours.

Snowflake also provides SnowConvert AI, which assists T-SQL to Snowflake SQL conversion. Combined with our tools, this covers the full spectrum of legacy-to-modern migration.

Wave-based approach: audit → DWH migration → transformation migration → report migration → parallel run → cutover. Each wave is validated before the next begins.

Plan your migration →

Migration scenarios to Snowflake & DBT
SQL Server DWH → Snowflake (SnowConvert)
T-SQL Procedures → Snowflake SQL (SnowConvert)
SSIS Packages → DBT Models (Auto · Decinova)
SAP BO Universes → Snowflake SQL (Auto · Decinova)
Oracle / Teradata → Snowflake (SnowConvert)
Flat Files / CSV → Snowflake Stages

Snowflake now natively supports Apache Iceberg, including bidirectional access with Microsoft Fabric. Your data remains accessible from both platforms without duplication.
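Creating a Snowflake-managed Iceberg table is a single DDL statement once an external volume exists. A sketch in which the table, volume, and location names are all illustrative:

```sql
-- Snowflake-managed Iceberg table; the external volume must be
-- configured beforehand and its name here is hypothetical.
CREATE ICEBERG TABLE analytics.marts.fct_orders_iceberg (
    order_id   NUMBER,
    order_date DATE,
    amount     NUMBER(12, 2)
)
  CATALOG = 'SNOWFLAKE'
  EXTERNAL_VOLUME = 'iceberg_vol'
  BASE_LOCATION = 'fct_orders_iceberg';
```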

Snowflake platform · State of the art 2026

Snowflake is no longer just a data warehouse

Snowflake has become a complete data platform: warehouse, lakehouse (Iceberg), AI (Cortex), native applications, data sharing, and marketplace. The ecosystem grows with each release.

Data Warehouse

Elastic compute

Auto-scalable warehouses, storage/compute separation, pay-per-use. Snowflake's historical core — and still its main strength.

Lakehouse

Apache Iceberg

Managed or external Iceberg tables, Polaris Catalog, bidirectional access with Fabric and Databricks. Open formats, zero lock-in.

AI / ML

Cortex AI

Built-in AI functions (AI_COMPLETE, AI_FILTER, classification), ML Jobs, model fine-tuning. AI directly inside the warehouse.
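Cortex functions are called directly in SQL, so an LLM step can sit inside an ordinary query. A sketch using the documented SNOWFLAKE.CORTEX functions; the table and column names are examples:

```sql
-- LLM functions inline in SQL (support_tickets is a hypothetical table)
SELECT
    ticket_id,
    SNOWFLAKE.CORTEX.SENTIMENT(body) AS sentiment,
    SNOWFLAKE.CORTEX.COMPLETE(
        'mistral-large',
        CONCAT('Summarise this support ticket in one sentence: ', body)
    ) AS summary
FROM support_tickets
LIMIT 10;
```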

Governance

Horizon Catalog

Automatic sensitive data classification, lineage, data quality metrics, access policies. Governance at the lake level.

Applications

Native Apps & Streamlit

Applications deployed directly within Snowflake. Streamlit for interactive dashboards. Marketplace for distribution.

Sharing

Data Sharing

Real-time data sharing between Snowflake accounts, without copying. Clean Rooms for secure inter-organisation collaboration.

Related articles

Our publications on Snowflake & DBT

DBT

DBT in production: testing, documentation and CI/CD best practices

Unit tests, data quality tests, Slim CI with state:modified+, GitHub Actions — how to industrialise DBT.

Snowflake

Optimise your Snowflake costs: warehouse sizing, clustering and best practices

The most common mistakes, how to detect them, and the optimisations that cut your bill by 30-50%.

Migration

Migrating a SQL Server data warehouse to Snowflake: methodology and field report

Key steps, pitfalls to avoid, SnowConvert AI usage, and real figures from an actual migration.

Let’s talk about your Snowflake & DBT project

Building, optimising or migrating?

Whether you're starting from scratch, looking to optimise costs, or planning a migration from your legacy stack, we have the expertise and tools to help.