Technology expertise
Decinova builds and maintains your data pipelines on Snowflake with dbt: tested and documented models, CI/CD via GitHub Actions, cost optimisation, and performance tuning.
Our Snowflake & dbt expertise
Snowflake and dbt have become the standard pairing for modern data engineering. Snowflake provides elastic compute and virtually unlimited storage. dbt brings software engineering practices to SQL transformations: version control, testing, documentation, and CI/CD.
But this power comes with complexity. A poorly sized warehouse can blow up your bill. Untested dbt models can silently produce incorrect data. Without governance, your lakehouse turns into a data swamp.
Decinova covers the full stack: design and maintenance of dbt models, Snowflake cost optimisation (warehouse sizing, query tuning, credit monitoring), CI/CD pipelines, and migration from legacy platforms.
dbt: SQL/Jinja model maintenance, test creation, documentation, refactoring. Dev/staging/prod environment management.
Snowflake: warehouse management, roles and grants, cost monitoring, network and security configuration.
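As an illustration, here is what a minimal tested dbt model looks like in practice. The model and column names are hypothetical; the pattern (an incremental SQL/Jinja model plus declarative tests in a schema file) is standard dbt:

```sql
-- models/marts/fct_orders.sql (hypothetical model)
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ ref('stg_orders') }}
{% if is_incremental() %}
  -- on incremental runs, only process rows newer than the target table
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
```

```yaml
# models/marts/schema.yml
version: 2
models:
  - name: fct_orders
    description: "One row per order."
    columns:
      - name: order_id
        description: "Primary key."
        tests: [unique, not_null]
```

Every `dbt build` then runs the model and fails the pipeline if a duplicate or null order_id appears, which is what "tested and documented models" means concretely.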
CI/CD for dbt
dbt's strength is that it treats SQL as code. And like code, it must be versioned in Git, tested on every change, and deployed through a controlled pipeline. This is where most organisations struggle.
Decinova implements a complete CI/CD workflow with GitHub Actions: every pull request automatically triggers a dbt build on a staging environment, runs tests, validates documentation, and deploys to production only after approval.
The result: data transformations as reliable and traceable as a software deployment. Every change is documented, every test is logged, every rollback is possible.
Slim CI: we test only what changed. With state:modified+, dbt identifies the modified models and their downstream dependencies, often cutting CI execution time by as much as 80% on large projects.
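A slim-CI workflow of this kind can be sketched as a GitHub Actions job. This is an illustrative skeleton, not our production pipeline: the manifest-fetch script and secret names are placeholders you would adapt, while the dbt flags (--select state:modified+, --defer, --state) are standard dbt CLI:

```yaml
# .github/workflows/dbt_ci.yml (illustrative sketch)
name: dbt CI
on: pull_request

jobs:
  slim-ci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake
      # Fetch the latest production manifest so dbt can compute
      # state:modified+. How you store it (artifact store, S3, dbt Cloud)
      # is up to you; this script is a placeholder.
      - run: ./scripts/fetch_prod_manifest.sh prod-artifacts/
      # Build and test only changed models and their descendants,
      # deferring unchanged upstream refs to production relations.
      - run: dbt build --select state:modified+ --defer --state prod-artifacts
        env:
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```

The --defer flag is what makes this cheap: unchanged parents are read from production instead of being rebuilt in the CI schema.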
Snowflake optimisation
Snowflake bills by credit consumed — and credits add up fast. An XL warehouse running continuously when a Small would suffice, poorly optimised queries scanning entire tables, unused materialized views: costs spiral quickly.
Companies typically waste 30-70% of their Snowflake credits. Not through negligence, but because cost optimisation requires specific expertise that most data teams lack.
Decinova audits your consumption, identifies waste sources, and implements optimisations. We then monitor cost trends and alert before overruns.
Analysis of query history, warehouse usage, and credit consumption by team. Identification of the top 10 waste sources.
Right-sizing warehouses by workload. Optimal auto-suspend/auto-resume configuration. Multi-cluster for peak loads.
Clustering keys on the most filtered columns. Micro-partition pruning verification. Dynamic tables for incremental loads.
Consumption alerts, budgets by warehouse and team, credit tracking dashboards.
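The levers above map onto a handful of Snowflake SQL statements. A hedged sketch, with hypothetical warehouse and monitor names (the ACCOUNT_USAGE views and resource monitor syntax are standard Snowflake):

```sql
-- Audit: find the most expensive queries of the last week.
-- Note: ACCOUNT_USAGE views can lag real time by up to ~45 minutes.
select query_text,
       warehouse_name,
       total_elapsed_time / 1000 as seconds
from snowflake.account_usage.query_history
where start_time > dateadd('day', -7, current_timestamp())
order by total_elapsed_time desc
limit 10;

-- Right-size an over-provisioned warehouse and suspend it aggressively.
alter warehouse analytics_wh set
    warehouse_size = 'SMALL'
    auto_suspend = 60        -- seconds of inactivity before suspending
    auto_resume = true;

-- Budget: notify at 80% of the monthly credit quota, suspend at 100%.
create resource monitor analytics_budget
  with credit_quota = 100
       frequency = monthly
       start_timestamp = immediately
  triggers on 80 percent do notify
           on 100 percent do suspend;

alter warehouse analytics_wh set resource_monitor = analytics_budget;
```

The resource monitor acts as a hard backstop; the day-to-day savings usually come from the right-sizing and auto-suspend settings.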
Migration to Snowflake & dbt
Migrating to Snowflake and dbt means transforming your storage infrastructure, your transformation layer, and your operational model simultaneously. Without the right tools, this is a multi-month project.
Decinova accelerates this transition with three proprietary tools. Our SAP BO converter automatically translates BusinessObjects universe SQL into Snowflake syntax. Our SSIS converter transforms DTSX packages into dbt models. And our audit scanner maps the entire legacy estate in 48 hours.
Snowflake also provides SnowConvert AI, which assists with T-SQL to Snowflake SQL conversion. Combined with our tools, this covers the full spectrum of legacy-to-modern migration.
Wave-based approach: audit → DWH migration → transformation migration → report migration → parallel run → cutover. Each wave is validated before the next begins.
Snowflake now natively supports Apache Iceberg, including bidirectional access with Microsoft Fabric. Your data remains accessible from both platforms without duplication.
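For reference, a Snowflake-managed Iceberg table is created with ordinary DDL. A minimal sketch, assuming an external volume has already been configured (the volume, database, and column names here are hypothetical):

```sql
-- Snowflake-managed Iceberg table: data lands in open Parquet/Iceberg
-- format on the external volume, so catalog-aware engines such as
-- Fabric can read it without copying.
create iceberg table analytics.public.events (
    event_id   string,
    event_time timestamp_ntz,
    payload    string
)
  catalog = 'SNOWFLAKE'
  external_volume = 'iceberg_vol'
  base_location = 'events/';
```

From dbt's point of view, such a table is queried and materialised like any other Snowflake relation.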
Snowflake platform · State of the art 2026
Snowflake has become a complete data platform: warehouse, lakehouse (Iceberg), AI (Cortex), native applications, data sharing, and marketplace. The ecosystem grows with each release.
Auto-scalable warehouses, storage/compute separation, pay-per-use. Snowflake's historical core — and still its main strength.
Managed or external Iceberg tables, Polaris Catalog, bidirectional access with Fabric and Databricks. Open formats, zero lock-in.
Built-in AI functions (AI_COMPLETE, AI_FILTER, classification), ML Jobs, model fine-tuning. AI directly inside the warehouse.
Automatic sensitive data classification, lineage, data quality metrics, access policies. Governance at the lake level.
Applications deployed directly within Snowflake. Streamlit for interactive dashboards. Marketplace for distribution.
Real-time data sharing between Snowflake accounts, without copying. Clean Rooms for secure inter-organisation collaboration.
Related articles
Let’s talk about your Snowflake & dbt project
Whether you're starting from scratch, looking to optimise costs, or planning a migration from your legacy stack, we have the expertise and tools to help.