
Sharecare

Sr. Data Engineer - Remote

Posted 12 Days Ago
Remote
Hiring Remotely in USA
Senior level

Job Description:

Sharecare is a digital healthcare company that delivers software and tech-enabled services to stakeholders across the healthcare ecosystem to help improve care quality, drive better outcomes, and lower costs. Through its data-driven AI insights, evidence-based resources, and comprehensive platform – including benefits navigation, care management, home care resources, health information management, and more – Sharecare helps people easily and efficiently manage their healthcare and improve their well-being. Across its three business channels, Sharecare enables health plan sponsors, health systems and physician practices, and leading pharmaceutical brands to drive personalized and value-based care at scale. To learn more, visit www.sharecare.com.

Job Summary:

Sharecare is seeking a Senior Data Engineer to help build and evolve our next-generation data platform supporting high-profile partners and customers. In this role, you will collaborate closely with Product, Account Management, QA, Analytics, and Architecture teams to deliver scalable, reliable, and secure data solutions.

This is a hands-on, high-impact position within a fast-paced, agile environment, offering the opportunity to develop breakthrough solutions in the health information and digital health space. The ideal candidate is self-driven, analytical, and detail-oriented, with deep expertise in Python-based data engineering, modern orchestration frameworks, and cloud-native architectures.

Essential Job Functions:

  • Design, build, and maintain scalable data pipelines using Python, Apache Airflow, and Apache Spark (an illustrative sketch follows this list)
  • Analyze business and technical requirements and translate them into reliable, future-proof data solutions
  • Develop, validate, deploy, and support complex ETL/ELT pipelines at scale
  • Build clean, secure, and maintainable REST APIs following company standards
  • Implement real-time and batch processing solutions for diverse data sources
  • Develop reusable data engineering and AI frameworks for enterprise-wide adoption
  • Define and manage domain-based “source of truth” data models, ensuring scalability and end-to-end data lineage
  • Implement data governance practices, automate data quality checks, and enable pipeline testing
  • Optimize data infrastructure for performance, cost efficiency, and reliability
  • Manage source control, CI/CD pipelines, and production deployments
  • Partner with data scientists and analysts to support AI/ML initiatives
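
As referenced in the first bullet, the following is a minimal, hypothetical sketch of the kind of pipeline work described above, written against the Airflow 2.x TaskFlow API. Every name in it (the DAG, the schedule, the record fields) is invented for illustration and is not drawn from the posting.

```python
# Minimal sketch of an extract -> validate -> load pipeline in Airflow 2.x.
# Hypothetical example only: the DAG name, schedule, and data shapes are
# invented for illustration, not Sharecare's actual pipelines.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def claims_pipeline():
    @task
    def extract() -> list[dict]:
        # In practice this would pull from an upstream source (S3, SFTP, an API).
        return [{"claim_id": 1, "amount": 125.50}]

    @task
    def validate(records: list[dict]) -> list[dict]:
        # Automated data-quality check: drop records missing required fields.
        return [r for r in records if "claim_id" in r and r.get("amount", 0) >= 0]

    @task
    def load(records: list[dict]) -> None:
        # In practice this would write to a warehouse such as Redshift.
        print(f"loading {len(records)} validated records")

    load(validate(extract()))


claims_pipeline()
```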

Specific Skills/Attributes:

  • Strong problem-solving and analytical thinking
  • Excellent communication and collaboration skills across technical and non-technical teams
  • Ability to work independently as well as within cross-functional, agile teams
  • Strong organizational and time-management skills
  • Ability to balance big-picture thinking with attention to technical detail
  • Flexibility and adaptability in a rapidly evolving environment

Qualifications:

  • Bachelor’s degree (or higher) in Computer Science, Data Engineering, or a related field
  • 10+ years of experience in data engineering or related roles
  • Strong proficiency in Python and related libraries (Pandas, SQLAlchemy, Boto3, Paramiko, Flask, FastAPI); a brief FastAPI sketch follows this list
  • Advanced SQL skills with experience analyzing healthcare datasets (e.g., claims, provider directories)
  • Hands-on experience with orchestration tools such as Apache Airflow and Databricks Workflows
  • Strong experience with Apache Spark; exposure to Apache Flink is a plus
  • Experience integrating with .NET applications and working with SQL Server
  • Cloud platform experience (AWS, Azure, or GCP required)
  • Experience with data warehousing solutions such as Amazon Redshift or Vertica
  • Solid understanding of distributed systems and functional programming concepts
  • Proficiency with Git and modern CI/CD practices
  • Exposure to streaming technologies, containerization, and ML pipelines preferred
  • Familiarity with AI tools and large language models is a plus
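
Since the role calls for building REST APIs in Python (Flask/FastAPI above), here is a minimal FastAPI sketch of the kind of service endpoint involved. The route, model, and response are hypothetical, invented purely for illustration.

```python
# Hypothetical sketch of a small FastAPI service endpoint; the route and
# model are invented for illustration, not Sharecare's actual API.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class PipelineRun(BaseModel):
    pipeline: str
    status: str


@app.get("/pipelines/{name}/latest", response_model=PipelineRun)
def latest_run(name: str) -> PipelineRun:
    # A real implementation would look up run metadata in a backing store.
    return PipelineRun(pipeline=name, status="success")
```

Assuming the file is saved as app.py, this can be served locally with `uvicorn app:app --reload`.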

Sharecare and its subsidiaries are Equal Opportunity Employers and E-Verify users. Qualified applicants will receive consideration for employment without regard to race, color, sex, national origin, sexual orientation, gender identity, religion, age, equal pay, disability, genetic information, protected veteran status, or other status protected under applicable law.

Top Skills

Amazon Redshift
Apache Airflow
Spark
AWS
Azure
Databricks
GCP
Git
Python
SQL
