At the forefront of health tech innovation, CopilotIQ+Biofourmis is transforming in-home care with the industry's first AI-driven platform that supports individuals through every stage of their health journey, from pre-surgical optimization to acute, post-acute, and chronic care. We are helping people live healthier, longer lives by bringing personalized, proactive care directly into their homes. With CopilotIQ's commitment to enhancing the lives of seniors with chronic conditions and Biofourmis' advanced data-driven insights and virtual care solutions, we're setting a new standard in accessible healthcare. If you're passionate about driving real change in healthcare, join the CopilotIQ+Biofourmis Team!
What is the Data Engineer role?
CopilotIQ is seeking a Data Engineer to join our team as an individual contributor in a remote capacity. In this hands-on role, you will be responsible for designing, developing, and maintaining robust data pipelines, as well as optimizing our data warehouse and data lake infrastructure. You will also support the organization’s business intelligence needs by ensuring the availability, reliability, and quality of data across the ecosystem. We are looking for a highly collaborative and detail-oriented professional with strong problem-solving skills and a passion for building efficient, scalable data solutions that drive impactful insights.
What you’ll be doing:
- Operate and own production data pipelines, with a focus on data quality (reliability, accuracy, and timeliness) and rapid incident resolution.
- Design, build, and optimize batch and streaming pipelines in Airflow, landing data in our Redshift-based warehouse.
- Lead hands-on development: build, test, and deploy high-throughput Airflow workflows and Python/SQL transformations (a minimal sketch of such a workflow follows this list); introduce DataOps practices (CI/CD, versioned schemas, automated data quality tests) to raise the engineering bar.
- Ensure rigorous governance, lineage, and privacy controls across data platforms.
- Write clean, efficient, and high-performance code in Python and SQL for data transformations and automation workflows.
- Partner closely with business stakeholders to support reporting and analytics through BI tools such as Sigma Computing and Superset.
- Optimize relentlessly: monitor performance, scalability, and infrastructure cost; tune queries, caching, and compression; and introduce new value-creating technologies (e.g., Redshift Serverless, Kinesis, Iceberg).
- Champion best practices in data engineering, including automation, data security, and operational excellence.
- Collaborate cross-functionally with analysts and software engineers to design and implement scalable, production-grade data solutions.
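By way of illustration, here is a minimal sketch of the kind of Airflow workflow described above, not CopilotIQ's actual pipeline. It assumes Airflow 2.4+ with the TaskFlow API, and the DAG, task, and table names (events_batch_pipeline and friends) are hypothetical; a production version would pull from S3 or DMS output and COPY into Redshift rather than pass rows in memory.

```python
# Illustrative sketch only; assumes Airflow 2.4+ (TaskFlow API).
# All names here are hypothetical.
from datetime import datetime, timedelta

from airflow.decorators import dag, task


@dag(
    schedule="@hourly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
)
def events_batch_pipeline():
    @task
    def extract() -> list:
        # A real pipeline would read from S3, DMS output, or an upstream API.
        return [{"event_id": 1, "value": 42.0}, {"event_id": 2, "value": None}]

    @task
    def transform(rows: list) -> list:
        # Basic quality rule: drop records with a null measurement.
        return [r for r in rows if r["value"] is not None]

    @task
    def load(rows: list) -> None:
        # In production this would stage to S3 and COPY into Redshift.
        print(f"loading {len(rows)} clean rows")

    load(transform(extract()))


events_batch_pipeline()
```

Each task is independently retryable, and the quality rule in transform is the kind of check that DataOps tooling would promote into an automated, versioned test.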
Requirements
- 5+ years of dedicated hands-on experience in designing, building, and operating production data pipelines and warehouses.
- Bachelor’s or Master’s in Computer Science.
- Programming & query languages: expert-level Python for data processing/automation and advanced SQL for analytical workloads.
- Core data-engineering stack: orchestration with Airflow (or Prefect); distributed processing with Apache Spark; AWS services (Redshift, Glue, Athena, S3, DMS, Lambda, and SQS); and at least one streaming technology such as Kinesis or Kafka.
- Deep understanding of dimensional and event-driven data modeling, partitioning, and performance tuning at terabyte-to-petabyte scale, including balancing performance, cost, and governance at that scale.
- Foundational knowledge of NoSQL technologies such as MongoDB and DynamoDB.
- Excellent problem-solving skills, strong attention to detail, and the ability to thrive in a fast-paced, dynamic environment.
- A mission-driven, collaborative mindset with strong product thinking and a desire to learn, grow, and make meaningful technical contributions.
Nice to have:
- AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect – Professional certification.
- Experience administering access control, datasets, and embedded dashboards in BI tools such as Sigma Computing and Superset.
- Experience rolling out DataOps practices (CI/CD for data, automated quality tests, lineage/observability tooling); a brief example of such a test follows this list.
- Experience leading technical initiatives in a growing team.
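To make the DataOps item above concrete, here is a hedged sketch of an automated quality test that could run in CI after each load. SQLite stands in for Redshift so the example stays self-contained, and the events table, value column, and 1% null threshold are all hypothetical.

```python
# Illustrative sketch only: a pytest-style data quality check of the kind a
# CI/CD-for-data setup would run. SQLite stands in for Redshift; the table,
# column, and threshold are hypothetical.
import sqlite3


def null_rate(conn: sqlite3.Connection, table: str, column: str) -> float:
    """Fraction of rows in `table` where `column` is NULL."""
    nulls, total = conn.execute(
        f"SELECT SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END), COUNT(*) "
        f"FROM {table}"
    ).fetchone()
    return (nulls or 0) / max(total, 1)


def test_events_landed_and_clean():
    # Build a toy copy of the target table; in CI this would be a warehouse
    # connection instead of an in-memory database.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (event_id INTEGER, value REAL)")
    conn.executemany("INSERT INTO events VALUES (?, ?)", [(1, 42.0), (2, 7.5)])

    (count,) = conn.execute("SELECT COUNT(*) FROM events").fetchone()
    assert count > 0, "no rows landed in the latest load"
    assert null_rate(conn, "events", "value") <= 0.01, "null rate above 1%"
```

Wired into CI, a failing check like this blocks the deploy or pages the on-call, which is what turns "data quality" from a bullet point into an operational guarantee.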