
CopilotIQ

Senior Data Engineer

Posted 11 Days Ago
Remote
Hiring Remotely in USA
50K-120K
Senior level

At the forefront of health tech innovation, CopilotIQ+Biofourmis is transforming in-home care with the industry's first AI-driven platform that supports individuals through every stage of their health journey, from pre-surgical optimization to acute, post-acute and chronic care. We are helping people live healthier, longer lives by bringing personalized, proactive care directly into their homes. With CopilotIQ's commitment to enhancing the lives of seniors with chronic conditions and Biofourmis' advanced data-driven insights and virtual care solutions, we're setting a new standard in accessible healthcare. If you're passionate about driving real change in healthcare, join the CopilotIQ+Biofourmis Team!

What is the Senior Data Engineer role?

CopilotIQ is seeking a Senior Data Engineer to join our team as an individual contributor in a remote capacity. In this hands-on role, you will be responsible for designing, developing, and maintaining robust data pipelines, as well as optimizing our data warehouse and data lake infrastructure. You will also support the organization’s business intelligence needs by ensuring the availability, reliability, and quality of data across the ecosystem. We are looking for a highly collaborative and detail-oriented professional with strong problem-solving skills and a passion for building efficient, scalable data solutions that drive impactful insights.

What you’ll be doing:

  • Operate and own production data pipelines, with a focus on data quality—reliability, accuracy, timeliness—and rapid incident resolution.
  • Design, build, and optimize batch and streaming pipelines in Airflow, landing data in our Redshift-based warehouse.
  • Lead hands-on development. Build, test, and deploy high-throughput Airflow workflows and Python/SQL transformations; introduce DataOps practices (CI/CD, versioned schemas, automated data quality tests) to raise the engineering bar.
  • Ensure rigorous governance, lineage, and privacy controls across data platforms.
  • Write clean, efficient, and high-performance code in Python and SQL for data transformations and automation workflows.
  • Partner closely with business stakeholders to support reporting and analytics through BI tools such as Sigma Computing and Superset.
  • Optimize relentlessly. Monitor performance, scalability, and infrastructure cost; tune queries, caching, and compression. Introduce new value-creating technologies (e.g., Redshift Serverless, Kinesis, Iceberg).
  • Champion best practices in data engineering, including automation, data security, and operational excellence.
  • Collaborate cross-functionally with analysts and software engineers to design and implement scalable, production-grade data solutions.
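Several of the responsibilities above center on data quality (reliability, accuracy, timeliness) and automated quality tests. As a purely illustrative sketch — the function name, field names, and thresholds below are hypothetical, not CopilotIQ's actual code — a minimal batch-level quality check of the kind that might run as an Airflow task could look like:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical thresholds -- illustrative only.
MAX_NULL_FRACTION = 0.01            # at most 1% of rows may lack a patient ID
MAX_STALENESS = timedelta(hours=6)  # newest record must be under 6 hours old

def check_batch_quality(rows, now=None):
    """Validate a batch of records for completeness and timeliness.

    `rows` is a list of dicts with 'patient_id' and 'recorded_at'
    (timezone-aware datetime) keys. Returns a list of failure
    messages; an empty list means the batch passes.
    """
    now = now or datetime.now(timezone.utc)
    if not rows:
        return ["batch is empty"]

    failures = []

    # Completeness: fraction of rows missing a patient identifier.
    nulls = sum(1 for r in rows if r.get("patient_id") is None)
    if nulls / len(rows) > MAX_NULL_FRACTION:
        failures.append(
            f"null patient_id fraction {nulls / len(rows):.2%} exceeds threshold"
        )

    # Timeliness: the most recent record must be fresh enough.
    newest = max(r["recorded_at"] for r in rows)
    if now - newest > MAX_STALENESS:
        failures.append(f"newest record is stale: {newest.isoformat()}")

    return failures
```

In an Airflow deployment, a check like this would typically be wrapped in a PythonOperator (or replaced by a dedicated framework such as Great Expectations), failing the task and paging on-call when the returned list is non-empty.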

Requirements

  • 5+ years of dedicated hands-on experience in designing, building, and operating production data pipelines and warehouses.
  • Bachelor’s or Master’s degree in Computer Science.
  • Programming & query languages: expert-level Python for data processing/automation and advanced SQL for analytical workloads.
  • Core data-engineering stack: orchestration with Airflow (or Prefect); distributed processing with Apache Spark; AWS services (Redshift, Glue, Athena, S3, DMS, Lambda, and SQS); and at least one streaming technology such as Kinesis or Kafka.
  • Deep understanding of dimensional and event-driven data modeling, partitioning, and performance tuning at terabyte-to-petabyte scale, including balancing performance, cost, and governance at that scale.
  • Foundational knowledge of NoSQL technologies such as MongoDB and DynamoDB.
  • Excellent problem-solving skills, strong attention to detail, and the ability to thrive in a fast-paced, dynamic environment.
  • A mission-driven, collaborative mindset with strong product thinking and a desire to learn, grow, and make meaningful technical contributions.

Bonus Points For

  • AWS Certified Data Analytics – Specialty or Solutions Architect – Professional.
  • Experience administering access control, datasets, and embedded dashboards in BI tools such as Sigma Computing and Superset.
  • Experience rolling out DataOps practices (CI/CD for data, automated quality tests, lineage/observability tooling).
  • Leading technical initiatives in a growing team.

Top Skills

Airflow
Spark
Athena
AWS
DMS
DynamoDB
Glue
Kafka
Kinesis
Lambda
MongoDB
Python
Redshift
S3
Sigma Computing
SQL
SQS
Superset

What you need to know about the Los Angeles Tech Scene

Los Angeles is a global leader in entertainment, so it’s no surprise that many of the biggest players in streaming, digital media and game development call the city home. But the city boasts plenty of non-entertainment innovation as well, with tech companies spanning verticals like AI, fintech, e-commerce and biotech. With major universities like Caltech, UCLA, USC and the nearby UC Irvine, the city has a steady supply of top-flight tech and engineering talent — not counting the graduates flocking to Los Angeles from across the world to enjoy its beaches, culture and year-round temperate climate.

Key Facts About Los Angeles Tech

  • Number of Tech Workers: 375,800; 5.5% of overall workforce (2024 CompTIA survey)
  • Major Tech Employers: Snap, Netflix, SpaceX, Disney, Google
  • Key Industries: Artificial intelligence, adtech, media, software, game development
  • Funding Landscape: $11.6 billion in venture capital funding in 2024 (Pitchbook)
  • Notable Investors: Strong Ventures, Fifth Wall, Upfront Ventures, Mucker Capital, Kittyhawk Ventures
  • Research Centers and Universities: California Institute of Technology, UCLA, University of Southern California, UC Irvine, Pepperdine, California Institute for Immunology and Immunotherapy, Center for Quantum Science and Engineering
