Data Engineer, InterOp COE

Posted November 25, 2025

Job Overview

About Us

Abacus Insights is changing the way healthcare works for you. We’re on a mission to unlock the power of data so health plans can enable the right care at the right time—making life better for millions of people. No more data silos, no more inefficiencies. Just smarter care, lower costs, and better experiences.

Backed by $100M from top VCs, we’re tackling big challenges in an industry that’s ready for change. And while GenAI is still new for many, we’ve already mastered turning complex healthcare data into clear, actionable insights. That’s our superpower—and it’s why we’re leading the way.

At Abacus, innovation starts with people. We’re bold, curious, and collaborative—because the best ideas come from working together. Ready to make an impact? Join us and let's build the future, together.

About the Role:

Come join our team! Help us tackle the data usability challenge for payers. Your expertise and experience will help drive meaningful performance outcomes. You'll also have the chance to advance your career, acquire new skills, and collaborate with some of the most innovative minds in payer data management.

We are seeking an experienced Senior Data Engineer to join our Connector Factory team. This role offers an opportunity to be a key contributor in a critical feature delivery team, where your expertise will guide the evolution of our data pipeline infrastructure. Our team is responsible for the development and operation of data pipelines that handle diverse data sources through both large batch and streaming systems. You'll work extensively with AWS services and play a crucial role in driving the growth and innovation of our platform.

You will:

  • Design and Develop Data Systems: Architect, build, and maintain data pipelines and ETL processes utilizing tools such as Databricks, Snowflake, SQL, and PySpark.
  • Enhance Data Quality: Play a pivotal role in creating and optimizing data assets to uphold high standards of data quality, performance, and reliability.
  • Manage Data Pipelines: Actively monitor and troubleshoot data pipelines to ensure efficient and uninterrupted data distribution.
  • Collaborate Across Teams: Partner with the Connector Factory team and cross-functional teams to understand client data requirements and transform these into scalable data solutions.
  • Implement Agile Practices: Apply Agile methodologies and best practices to drive incremental improvements and adapt to emerging requirements.
  • Communicate Effectively: Keep communication channels open with stakeholders to gather and clarify requirements and provide regular updates on project progress.
  • Ensure Data Security: Stay committed to data privacy, security, and regulatory compliance, particularly given the sensitive nature of healthcare data.

What we're looking for:

  • Educational Background: Bachelor’s degree in Computer Science, Engineering, or a related field. Advanced degrees are a plus.
  • Extensive Experience: A minimum of 5 years of experience in data engineering and big data architecture.
  • Technical Expertise: Deep knowledge of designing and maintaining big data architectures, including data lakes, columnar databases, large batch processing (Spark), and stream processing (Kafka).
  • Cloud Proficiency: Strong experience with AWS data services and building scalable, distributed systems on cloud platforms.
  • Programming Skills: Proficiency in Python or other object-oriented programming languages.
  • Data Analysis Skills: Hands-on experience with data analysis and modeling of large data sets.
  • Project Management: Strong organizational skills and experience managing complex projects.
  • Root Cause Analysis: Proven ability to perform root cause analysis on data processes to improve efficiency and resolve business queries.
  • Adaptability: A willingness to learn about new technologies and adapt to changing environments.
  • Independent and Collaborative Work: Ability to self-direct tasks and effectively collaborate within a technical team.
  • Infrastructure Automation: Familiarity with tools such as Terraform and GitLab CI/CD for infrastructure automation.
  • Business Acumen: Comfort with ambiguity and a keen interest in solving business-related problems.
  • Agile Experience: Background working in an Agile delivery framework.

Bonus Points:

  • Relevant certifications in data engineering, cloud computing, or specific technologies such as Databricks, Snowflake, or AWS.

Our Commitment as an Equal Opportunity Employer

As a mission-led technology company helping to drive better healthcare outcomes, Abacus Insights believes that the best innovation and value we can bring to our customers comes from diverse ideas, thoughts, experiences, and perspectives. Therefore, we dedicate resources to building diverse teams and providing equal employment opportunities to all applicants. Abacus prohibits discrimination and harassment regarding race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state, or local laws.

At the heart of who we are is a commitment to continuously and intentionally building an inclusive culture—one that empowers every team member across the globe to do their best work and bring their authentic selves. We carry that same commitment into our hiring process, aiming to create an interview experience where you feel comfortable and confident showcasing your strengths. If there’s anything we can do to support that—big or small—please let us know.
