Data Engineer (Snowflake, DBT, Airflow)
Full-time · Mid-Senior Level
Role Overview
We are looking for an Intermediate Data Engineer to join the Enterprise Data Platform (EDP) team. The role focuses on building, maintaining, and optimizing scalable data pipelines that power investment, risk, research, and analytics use cases across the organization.
Key Responsibilities
Data Pipeline Engineering
- Design and develop robust ETL/ELT pipelines using Python, Airflow, and DBT (a minimal ingestion sketch follows this list).
- Build and optimize Snowflake data models for performance, scalability, and cost efficiency.
- Implement ingestion pipelines for internal and external financial datasets (Market, Securities, Pricing, ESG, Ratings).
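In practice, the pipeline work above often starts as a small Airflow DAG that lands a provider file and copies it into a Snowflake raw table. The sketch below is one minimal way to do that, assuming Airflow 2.4+ and the snowflake-connector-python package; the dataset, landing path, table, and connection parameters (pricing.csv, PRICES, LOAD_WH, and so on) are hypothetical placeholders, not EDP specifics.

```python
# A minimal ELT ingestion sketch: extract a raw file, then load it into a
# Snowflake raw table via PUT + COPY INTO. All names are hypothetical.
import pendulum
from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    tags=["edp", "ingestion"],
)
def pricing_ingest():
    @task
    def extract() -> str:
        # Pull the raw pricing file from an upstream provider (stubbed here).
        path = "/tmp/pricing.csv"  # hypothetical landing path
        return path

    @task
    def load(path: str) -> None:
        # Import inside the task so DAG parsing stays lightweight.
        import snowflake.connector

        with snowflake.connector.connect(
            account="my_account",   # hypothetical connection parameters
            user="etl_user",
            password="***",
            warehouse="LOAD_WH",
            database="RAW",
            schema="PRICING",
        ) as conn:
            cur = conn.cursor()
            # Stage the file on the table stage, then copy it in.
            cur.execute(f"PUT file://{path} @%PRICES")
            cur.execute("COPY INTO PRICES FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)")

    load(extract())


pricing_ingest()
```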
Data Modeling & Transformation
- Develop DBT models using best practices (sources, staging, marts).
- Apply data quality checks, tests, and documentation within DBT (see the schema sketch after this list).
- Ensure consistent data transformations aligned with EDP standards.
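For the DBT testing and documentation point above, a staging model typically ships with a schema.yml declaring column descriptions and tests. To keep the examples in Python, the sketch below simply builds that YAML structure with PyYAML so its shape is explicit; the model and column names (stg_prices, security_id) are hypothetical, and in practice the file is usually hand-written.

```python
# Sketch of a dbt schema.yml for a staging model, expressed as a Python dict
# and rendered with PyYAML. Names and descriptions are hypothetical.
import yaml

schema = {
    "version": 2,
    "models": [
        {
            "name": "stg_prices",  # hypothetical staging model
            "description": "One row per security per pricing date.",
            "columns": [
                {
                    "name": "security_id",
                    "description": "Internal security identifier.",
                    "tests": ["not_null", "unique"],
                },
                {
                    "name": "price_date",
                    "description": "Pricing date (UTC).",
                    "tests": ["not_null"],
                },
            ],
        }
    ],
}

# Print the rendered YAML; a real project would save this under models/staging/.
print(yaml.safe_dump(schema, sort_keys=False))
```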
Orchestration & Monitoring
- Create and manage Airflow DAGs with dependency handling, retries, and alerting (see the configuration sketch after this list).
- Monitor pipeline health and troubleshoot failures proactively.
- Support incremental loads, backfills, and SLA-driven workflows.
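Retries, alerting, and SLAs of the kind listed above are usually declared in a DAG's default_args. The sketch below assumes Airflow 2.4+ with the dbt CLI available on the worker; the DAG name, dbt target, and the logging-only failure callback are hypothetical stand-ins for the team's real alerting hooks.

```python
# Sketch of retry, alerting, and SLA configuration on an Airflow DAG that
# orchestrates DBT runs and tests. Names are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator


def notify_on_failure(context):
    # Stand-in for an email/Slack/PagerDuty hook; here we only log the task.
    print(f"Task failed: {context['task_instance'].task_id}")


default_args = {
    "retries": 3,                           # retry transient failures
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,
    "sla": timedelta(hours=2),              # flag runs that breach the SLA
}

with DAG(
    dag_id="edp_dbt_daily",                 # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    run = BashOperator(task_id="dbt_run", bash_command="dbt run --target prod")
    test = BashOperator(task_id="dbt_test", bash_command="dbt test --target prod")
    run >> test  # tests only execute after the models build successfully
```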
Data Quality & Governance
- Implement validation, reconciliation, and completeness checks (see the reconciliation sketch after this list).
- Ensure adherence to data governance, security, and access control policies.
- Collaborate with stakeholders to resolve data discrepancies.
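A common shape for the reconciliation checks above is a row-count comparison between a raw landing table and its curated counterpart. The sketch below uses snowflake-connector-python; the table and connection names are hypothetical, and a real check would also compare aggregates or hashes on key columns.

```python
# Sketch of a row-count reconciliation between a raw table and a curated
# mart table in Snowflake. All object and connection names are hypothetical.
import snowflake.connector


def count_rows(cur, table: str) -> int:
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]


with snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="QA_WH", database="EDP",
) as conn:
    cur = conn.cursor()
    raw = count_rows(cur, "RAW.PRICING.PRICES")
    curated = count_rows(cur, "MARTS.PRICING.FCT_PRICES")
    if raw != curated:
        # Surface the discrepancy for follow-up with upstream providers.
        raise ValueError(f"Row-count mismatch: raw={raw}, curated={curated}")
```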
Collaboration & Delivery
- Work closely with data analysts, platform teams, and upstream data providers.
- Participate in design reviews, code reviews, and sprint planning.
- Contribute to reusable frameworks and EDP accelerators.