Data Architect

Posted February 25, 2026

Job Overview

The Role:

We are looking for a hands-on Principal Data Architect for the challenging and rewarding work of building a future-proof data architecture for one of our customers in the financial domain.

Responsibilities:

  • Own design and maintenance of all aspects of data solutions including modeling, developing, technical documentation, data diagrams and data dictionaries.

  • Provide expertise in the development of standards, architectural governance, design patterns, and practices; evaluate the best applicable solutions for different use cases.

  • Determine and develop architectural approaches and solutions, conduct business reviews, document current systems, and develop recommendations.

  • Lead the data strategy and own the vision and roadmap for data products.

  • Work with stakeholders to ensure that data-related business requirements for protecting sensitive data are clearly defined, communicated, well understood, and considered as part of operational prioritization and planning.

  • Develop, maintain, and optimize data infrastructure using Delta Lake, MLflow, and Databricks SQL to enhance data management, processing, and analytics.

  • Utilize Snowflake’s features such as data sharing, zero-copy cloning, and automatic scaling to optimize data storage, accessibility, and performance. Ensure effective management of both semi-structured and structured data within Snowflake’s architecture.

  • Implement and manage data storage solutions using Amazon S3, and perform data warehousing with Amazon Redshift.

  • Design and implement data integration workflows using AWS Glue to orchestrate and automate data movement and transformation.

  • Design and implement scalable data pipelines using tools like Apache Kafka or Apache Airflow to facilitate real-time data processing and batch data workflows.

  • Apply advanced analytics techniques, including predictive modeling and data mining, to uncover insights and drive data-driven decision-making.

Requirements:

  • 15+ years of overall experience

  • 10+ years of data architecture, data platform, and data warehouse experience

  • Hands-on experience with Snowflake (4+ years) and Databricks (4+ years)

  • 5+ years of combined experience across Snowflake and Databricks

  • Proficiency in features such as Delta Lake, MLflow, and Databricks SQL. Experience in managing Spark clusters and implementing machine learning workflows.

  • Solid experience with emerging and traditional data stack components, such as batch and real-time data ingestion, ETL, ELT, orchestration tools, on-prem and cloud data warehouses, Python, and structured, semi-structured, and unstructured databases

  • Knowledge of features like Snowflake’s data sharing, zero-copy cloning, and automatic scaling. Experience in working with Snowflake’s architecture for semi-structured and structured data.

  • Experience with services like Amazon S3, Amazon Redshift, and AWS Glue.

  • Proficiency in tools such as Apache NiFi, Talend, Informatica, or Microsoft SQL Server Integration Services (SSIS).

  • Experience in designing and implementing data pipelines using tools like Apache Kafka or Apache Airflow.

  • Ability to perform data profiling, data quality assessments, and performance tuning.

  • Experience in comparing and evaluating different data technologies based on criteria like performance, scalability, and cost.

  • Skills in applying advanced analytics techniques, including predictive modeling and data mining.

  • Expertise in industry-standard data practices, data strategies, and data concepts

  • Demonstrated experience in architecting/re-architecting complex data systems and data models.

  • Demonstrated experience in overall system design, including database selection and solutioning.

Nice-to-Have Skills:

  • Experience with data governance tools such as Collibra or Alation.

  • Knowledge of data quality frameworks and standards, such as Data Quality Dimensions (completeness, consistency, etc.).

  • Familiarity with tools like Apache Beam or Luigi for managing complex data workflows.

  • Awareness of emerging data technologies such as data mesh, data fabric, and real-time data processing frameworks.
