SQL Migration Data Engineer

Posted January 16, 2026
Salaried, full-time · USD 97,000 - 125,000

Job Overview

About the role

  • API & DataLakes Data Engineer (Azure) - SQL Migration

What you'll do

  • Collaborate with the delivery team on a SQL Server 2012 to Azure SQL/Fabric Lakehouse migration, including assessment, planning, and execution
  • Develop and optimize ETL/ELT processes to migrate legacy SQL Server 2012 databases to modern cloud data platforms, minimizing data loss and downtime
  • Design and build data pipelines using Azure Data Factory, Databricks, and Microsoft Fabric Lakehouse to transform monolithic databases into distributed Lakehouse architectures
  • Develop APIs and data services on top of Microsoft Fabric Lakehouse to expose migrated data for downstream applications and stakeholders
  • Collaborate with infrastructure and application teams to assess legacy SQL 2012 environments, identify technical debt, and plan phased migration approaches
  • Develop infrastructure and automation required for optimal extraction, transformation, and loading of data from SQL Server 2012 and other legacy sources using SQL, dbt, Python, and Fabric technologies
  • Define and document cloud solution architectures, migration roadmaps, and technical designs for data modernization initiatives
  • Generate and document unit tests, performance benchmarks, and migration validation scripts
  • Establish data quality frameworks and governance practices for migrated data assets in Lakehouse environments


Qualifications

  • Bachelor's Degree in Computer Science or related field; Azure Cloud Certifications strongly preferred
  • At least 3 years of Data Engineering experience, with 1+ years specifically in SQL Server migrations to cloud platforms
  • Hands-on experience with SQL Server 2012 architecture, T-SQL optimization, and migration patterns (compatibility issues, index strategies, etc.)
  • Proficiency in Azure Data Factory, Synapse Analytics, Azure SQL, Data Lake Storage, and Microsoft Fabric (especially Lakehouse), including data modeling, partitioning, and optimization for analytical workloads
  • Demonstrated experience building APIs or data services on top of Lakehouse/Delta Lake architectures
  • Proficiency with dbt for transformation logic and data lineage documentation
  • Strong command of Python, SQL, T-SQL, and scripting for automation and data validation
  • Experience with Azure Infrastructure-as-Code (Bicep, ARM templates, Terraform)
  • Experience building CI/CD pipelines for data infrastructure
  • Knowledge of data governance, metadata management, and data quality frameworks
  • Ability to work independently in Agile environments with minimal supervision on external client projects


Other Info

  • Candidates must be able to accommodate a working schedule of 8:30am - 5:30pm EST, Monday through Friday.
  • We are unable to provide sponsorship for this position at this time.
  • Benefits may include: Medical, Dental, and Vision Insurance; Life, Short Term Disability, and Long Term Disability Insurance; accrued time off (25 days/year); paid holidays; annual target bonus; Company-sponsored 401(k) plan; monthly wellness/tech stipends.
  • Position may require infrequent travel to client sites, estimated at up to 5%. The role is otherwise remote (US-based).
  • Successful candidates will be subject to thorough background checks, including education, employment, and physical location verifications.
