
Interwell Health

Staff Data Engineer

Posted 12 Days Ago
Easy Apply
Remote
Hiring Remotely in United States
Senior level

Interwell Health is a kidney care management company that partners with physicians on its mission to reimagine healthcare—with the expertise, scale, compassion, and vision to set the standard for the industry and help patients live their best lives. We are on a mission to help people and we know the work we do changes their lives. If there is a better way, we will create it. So, if our mission speaks to you, join us!

Reporting to the Director of Data Engineering, the Staff Data Engineer serves as a senior technical leader responsible for shaping, scaling, and governing our modern data ecosystem. This role blends architecture, hands-on engineering, platform leadership, and cross-functional partnership to deliver high-quality data products that power clinical, operational, financial, and analytical outcomes. Deep experience with Databricks, Python, dbt, and Microsoft Fabric, along with strong fluency in healthcare data and compliance standards, is essential. At its core, you’ll work closely with teams across the organization to deliver governed, high-quality, analytics-ready data at scale.

Our Tech Stack: Databricks, Delta Lake, Unity Catalog, Microsoft Fabric (OneLake, Lakehouse, Data Factory), Azure, dbt, Python, PySpark, Spark SQL. 

What You’ll Do: 

Architecture & Strategy 

  • Design and evolve a scalable, secure, cloud‑native lakehouse platform leveraging Databricks, Microsoft Fabric (OneLake, Lakehouse, Data Factory), and dbt. 
  • Define modeling patterns, governance frameworks, and engineering best practices across the data lifecycle. 
  • Lead design reviews and guide teams in adopting scalable architectural patterns. 
  • Drive long‑term platform strategy and evaluate emerging technologies. 

Hands-on Engineering 

  • Design and implement batch and streaming data pipelines for healthcare data sources (EHR, claims, HL7/FHIR, APIs, flat files, databases).
  • Develop modular ingestion, quality, lineage, metadata, and observability frameworks that scale across domains. 
  • Produce clean, analytics‑ready datasets and data models for BI, analytics, and machine learning workloads. 
  • Implement HIPAA‑aligned access patterns and secure handling of PHI. 
  • Architect Databricks workloads (clusters, jobs, Unity Catalog, Delta Lake) for reliability, performance, and cost efficiency. 
  • Integrate Databricks and Microsoft Fabric with Azure services and enterprise systems. 
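To make the "quality, lineage, metadata, and observability frameworks" bullet concrete, here is a minimal, hypothetical sketch of a modular data-quality check of the kind an ingestion framework might apply to incoming records. It is illustrative only, not Interwell Health's actual code; the field names (`member_id`, `paid_amount`) and rule names are invented for the example.

```python
from dataclasses import dataclass
from typing import Callable

# A composable data-quality rule: a name plus a row-level predicate.
# Collecting pass/fail counts per rule gives a simple observability signal.
@dataclass
class Rule:
    name: str
    predicate: Callable[[dict], bool]

def run_checks(rows: list[dict], rules: list[Rule]) -> dict:
    """Apply every rule to every row; return pass/fail counts per rule."""
    report = {rule.name: {"passed": 0, "failed": 0} for rule in rules}
    for row in rows:
        for rule in rules:
            key = "passed" if rule.predicate(row) else "failed"
            report[rule.name][key] += 1
    return report

# Hypothetical claims records; field names are invented for illustration.
claims = [
    {"member_id": "M001", "paid_amount": 120.0},
    {"member_id": None,   "paid_amount": 75.5},
    {"member_id": "M002", "paid_amount": -10.0},
]

rules = [
    Rule("member_id_not_null", lambda r: r["member_id"] is not None),
    Rule("paid_amount_non_negative", lambda r: r["paid_amount"] >= 0),
]

report = run_checks(claims, rules)
print(report)
```

Because each rule is just data (a name and a predicate), new checks can be added per domain without touching the runner, which is the "modular, scales across domains" property the bullet describes.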

 Leadership & Collaboration 

  • Partner with product managers, data scientists, analysts, clinicians, and business stakeholders to translate healthcare data needs into scalable solutions. 
  • Lead cross-functional initiatives that modernize and unify the organization’s data ecosystem.
  • Mentor senior and mid-level engineers; elevate team capability through technical coaching and standards. 
  • Drive roadmap planning, platform evolution, and long-term data strategy. 
  • Champion engineering excellence, reliability practices, documentation quality, and governance. 

What You’ll Need 

  • Bachelor's degree in Computer Science, Engineering, or a related field. 
  • 7+ years of experience in data engineering.  
  • 2+ years operating in a senior- or staff-level engineering role.
  • Deep hands-on proficiency with Databricks, Spark, Delta Lake, dbt, and Python. 
  • Proven ability to design and operate large-scale cloud data platforms (Azure preferred).
  • Hands-on experience with Microsoft Fabric: Data Engineering, Data Factory, Lakehouse, and OneLake.
  • Advanced data platform architecture and Lakehouse design expertise.  
  • Demonstrated ability to design modular, extensible frameworks and guide the long-term evolution of enterprise data platforms. 
  • Strong command of distributed data processing and cloud-native engineering.
  • Experience working in HIPAA-regulated environments and handling PHI.
  • Healthcare data fluency, including regulated data handling and compliance.  
  • Technical leadership, mentorship, and influence across teams. 
  • Strong communication skills with both technical and clinical stakeholders. 
  • Experience with platform reliability, CI/CD for data pipelines, and infrastructure as code. 
  • 100% remote (ET or CT work hours preferred) 

Preferred 

  • Experienced in implementing and supporting Epic integrations, leveraging Cogito Cloud and Caboodle data models, and delivering reliable incremental data pipelines from Caboodle/Clarity. 
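The "reliable incremental data pipelines from Caboodle/Clarity" pattern usually comes down to watermark-based extraction: pull only rows modified since the last successful run, then advance the watermark. The sketch below is a simplified, hypothetical illustration using an in-memory SQLite database as a stand-in source; the table and column names (`encounters`, `updated_at`) are invented for the example and are not Epic's actual schema.

```python
import sqlite3

# Stand-in source; in practice this would be a Caboodle/Clarity extract.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE encounters (id INTEGER, updated_at TEXT)")
conn.executemany(
    "INSERT INTO encounters VALUES (?, ?)",
    [(1, "2024-01-01"), (2, "2024-01-05"), (3, "2024-01-09")],
)

def incremental_extract(conn, watermark: str) -> tuple[list[tuple], str]:
    """Pull only rows modified after the stored watermark, then advance it."""
    rows = conn.execute(
        "SELECT id, updated_at FROM encounters WHERE updated_at > ? "
        "ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    new_watermark = rows[-1][1] if rows else watermark
    return rows, new_watermark

# First run picks up everything after the stored watermark;
# a second run with the advanced watermark returns nothing new.
rows, wm = incremental_extract(conn, "2024-01-03")
print(rows, wm)
```

Persisting the watermark between runs (and using a monotonically increasing change column) is what makes the pipeline reliably resumable after failures.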

 

Our mission is to reinvent healthcare to help patients live their best lives, and we proudly live our mission-driven values: 

  • We care deeply about the people we serve.
  • We are better when we work together.
  • Humility is a source of our strength.  
  • We bring joy to our work.
  • We deliver on our promises. 

We are committed to diversity, equity, and inclusion throughout our recruiting practices. Everyone is welcome and included. We value our differences and learn from each other. Our team members come in all shapes, colors, and sizes. No matter how you identify your lifestyle, creed, or fandom, we value everyone's unique journey.

Oh, and one more thing … a recent study shows that men apply for a job or promotion when they meet only 60% of the qualifications, but women and other marginalized groups apply only if they meet 100% of them. So, if you think you’d be a great fit, but don’t necessarily meet every single requirement on one of our job openings, please still apply. We’d love to consider your application!   

Come join us and help our patients live their best lives. Learn more at www.interwellhealth.com.

It has come to our attention that some individuals or organizations are reaching out to job seekers, posing as potential employers and presenting enticing employment offers. We want to emphasize that these offers are not associated with our company and may be fraudulent. Please note that our organization will not extend a job offer without prior communication with our recruiting team and hiring managers and a formal interview process.

Top Skills

Azure
Caboodle
CI/CD
Clarity
Cogito Cloud
Data Factory
Databricks
dbt
Delta Lake
Epic
Infrastructure as Code
Lakehouse
Microsoft Fabric
OneLake
PySpark
Python
Spark SQL
Unity Catalog
