· Design, develop, and maintain scalable and reusable data models optimized for analytics and reporting use cases.
· Build and manage data transformations using dbt, including tests, documentation, and performance optimization.
· Implement and enforce data modeling best practices (e.g., star schemas and dimensional modeling).
· Ensure data quality, consistency, lineage, and reliability through testing, validations, and monitoring.
· Collaborate with analytics, product, and business stakeholders to gather data requirements and translate them into robust models.
· Work with modern cloud data warehouses, primarily Snowflake, to support analytical workloads.
· Support BI and reporting needs as required, including troubleshooting data issues and enabling new datasets.
· Contribute to data platform improvements, standards, and best practices across the data engineering team.
· Strong hands‑on experience with dbt (data build tool) for data transformations, testing, and documentation.
· Advanced SQL skills with the ability to write efficient, maintainable analytical queries.
· Solid understanding of data modeling concepts, including star schemas, dimensional modeling, and fact/dimension design.
· Experience working with modern data warehouses, preferably Snowflake.
· Strong focus on data quality, performance optimization, and reliability.
· Ability to work closely with stakeholders and translate business requirements into analytical datasets.
· Experience with BI and visualization tools such as Microsoft Power BI or Looker.
· Ability to understand, support, and troubleshoot BI reports and dashboards.
· Exposure to Microsoft Azure data services such as Azure Data Factory (ADF) and Databricks.
· Familiarity with ELT/ETL orchestration, data governance, or metadata management.