DBT Engineer
We are seeking an experienced DBT Engineer with strong expertise in Azure Cloud, DBT (Data Build Tool), and Snowflake. The ideal candidate will have a proven background in building scalable data pipelines, designing efficient data models, and enabling advanced analytics.
You’ll collaborate with cross-functional teams to deliver robust and efficient software systems.
No. of Positions: 2   /   Location: Fully Remote (preferred candidates in Pune, but open to all)
Work Hours: 4 PM to 1:30 AM IST, adjusted for daylight saving time (until 3 PM CST)
See Requirements below
Submit Your Application

Key Responsibilities

  • Design and maintain scalable ETL pipelines with DBT and SQL, ensuring high performance and reliability.
  • Develop advanced DBT workflows using artifact files, graph variables, and complex macros leveraging run_query.
  • Implement multi-repo or mesh DBT setups to support scalable and collaborative workflows.
  • Utilize DBT Cloud features such as documentation, Explorer, CLI, and orchestration to optimize data processes.
  • Build and manage CI/CD pipelines to automate and enhance data deployment processes.
  • Write and optimize complex SQL queries to transform large datasets and ensure data accuracy.
  • Collaborate with cross-functional teams to integrate data solutions into existing workflows.
  • Troubleshoot and resolve errors in pipelines caused by DBT code or transformation issues.
  • Adhere to best practices for version control using git flow workflows to manage and deploy code changes.
  • Ensure code quality and maintainability by implementing code linting and conducting code reviews.

Must-Have Qualifications

  • 8+ years of experience in data engineering with a strong focus on ETL processes and data pipeline management.
  • Must have experience in Azure Cloud data warehousing, including ADF, Azure Data Lake, DBT, and Snowflake.
  • 4+ years of hands-on experience with DBT.
  • Advanced proficiency in SQL and data modeling techniques.
  • Deep understanding of DBT, including artifact files, graph usage, and MetricFlow.
  • Proficiency in DBT Cloud features like CLI, orchestration, and documentation.
  • Strong skills in Python for scripting and automation tasks.
  • Familiarity with CI/CD pipeline tools and workflows.
  • Hands-on experience with git flow workflows for version control.
  • Solid troubleshooting skills to resolve pipeline errors efficiently.
  • Knowledge of pipeline orchestration and automation.

Good to Have

  • A proactive problem-solver with excellent attention to detail.
  • Strong communication and collaboration skills to work with cross-functional teams.
  • A positive attitude and ownership mindset to drive projects to completion.