
CopilotIQ

Senior Data Engineer

Reposted 12 days ago
Remote
Hiring Remotely in USA
Senior level

At the forefront of health tech innovation, CopilotIQ is transforming in-home care with the industry's first AI-driven platform that supports individuals through every stage of their health journey, from pre-surgical optimization to acute, post-acute, and chronic care. We are helping people live healthier, longer lives by bringing personalized, proactive care directly into their homes. If you're passionate about driving real change in healthcare, join the team!

What is the Senior Data Engineer role?

CopilotIQ is seeking a Senior Data Engineer to join our team as an individual contributor in a remote capacity. In this hands-on role, you will be responsible for designing, developing, and maintaining robust data pipelines, as well as optimizing our data warehouse and data lake infrastructure. You will also support the organization’s business intelligence needs by ensuring the availability, reliability, and quality of data across the ecosystem. We are looking for a highly collaborative and detail-oriented professional with strong problem-solving skills and a passion for building efficient, scalable data solutions that drive impactful insights.

What you’ll be doing:

  • Operate and own production data pipelines, with a focus on data quality (reliability, accuracy, and timeliness) and rapid incident resolution.
  • Design, build, and optimize batch and streaming pipelines in Airflow, landing data in our Redshift-based warehouse.
  • Lead hands-on development. Build, test, and deploy high-throughput Airflow workflows and Python/SQL transformations; introduce DataOps practices (CI/CD, versioned schemas, automated data quality tests) to raise the engineering bar.
  • Ensure rigorous governance, lineage, and privacy controls across data platforms.
  • Write clean, efficient, and high-performance code in Python and SQL for data transformations and automation workflows.
  • Partner closely with business stakeholders to support reporting and analytics through BI tools such as Sigma Computing and Superset.
  • Optimize relentlessly. Monitor performance, scalability, and infrastructure cost; tune queries, caching, and compression. Introduce new value-creating technologies (e.g., Redshift Serverless, Kinesis, Iceberg).
  • Champion best practices in data engineering, including automation, data security, and operational excellence.
  • Collaborate cross-functionally with analysts and software engineers to design and implement scalable, production-grade data solutions.
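To make the "automated data quality tests" responsibility concrete, here is a minimal, hypothetical sketch of the kind of batch validation a pipeline task might run before loading rows into the warehouse. The field names, staleness threshold, and function name are illustrative assumptions, not CopilotIQ's actual schema or tooling.

```python
from datetime import datetime, timedelta, timezone

# Illustrative schema and threshold -- assumptions, not the real pipeline's.
REQUIRED_FIELDS = ("patient_id", "metric", "value", "recorded_at")
MAX_STALENESS = timedelta(hours=24)


def validate_batch(rows, now=None):
    """Split a batch of dict rows into (passed, failed) partitions.

    A row fails if any required field is missing or None, or if its
    `recorded_at` timestamp is older than MAX_STALENESS.
    """
    now = now or datetime.now(timezone.utc)
    passed, failed = [], []
    for row in rows:
        missing = [f for f in REQUIRED_FIELDS if row.get(f) is None]
        stale = (
            row.get("recorded_at") is not None
            and now - row["recorded_at"] > MAX_STALENESS
        )
        (failed if missing or stale else passed).append(row)
    return passed, failed
```

A function like this could be wrapped in an Airflow `PythonOperator` that raises (and so fails the task) when the failed fraction exceeds a threshold, quarantining bad rows instead of loading them.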

Requirements

  • 5+ years of dedicated hands-on experience in designing, building, and operating production data pipelines and warehouses.
  • Bachelor’s or Master’s degree in Computer Science.
  • Programming & query languages: expert-level Python for data processing/automation and advanced SQL for analytical workloads.
  • Core data-engineering stack: orchestration with Airflow (or Prefect); distributed processing with Apache Spark; AWS services (Redshift, Glue, Athena, S3, DMS, Lambda, and SQS); and at least one streaming technology such as Kinesis or Kafka.
  • Deep understanding of dimensional and event-driven data modeling, partitioning, and performance tuning at terabyte-to-petabyte scale; comfortable balancing performance, cost, and governance at that scale.
  • Foundational knowledge of NoSQL technologies such as MongoDB and DynamoDB.
  • Excellent problem-solving skills, strong attention to detail, and the ability to thrive in a fast-paced, dynamic environment.
  • A mission-driven, collaborative mindset with strong product thinking and a desire to learn, grow, and make meaningful technical contributions.

Bonus Points For

  • AWS Certified Data Analytics – Specialty or Solutions Architect – Professional.
  • Experience administering access control, datasets, and embedded dashboards in BI tools such as Sigma Computing and Superset.
  • Experience rolling out DataOps practices (CI/CD for data, automated quality tests, lineage/observability tooling).
  • Experience leading technical initiatives in a growing team.

Top Skills

Airflow
Spark
Athena
AWS
DMS
DynamoDB
Glue
Kafka
Kinesis
Lambda
MongoDB
Python
Redshift
S3
Sigma Computing
SQL
SQS
Superset


