About COVU
COVU is a venture-backed technology startup transforming the insurance industry. We empower independent agencies with AI-driven insights and digitized operations, enabling them to manage risk more effectively. Our team is building an AI-first company set to redefine the future of insurance distribution.
Location:
This role can be hybrid or remote. If the candidate is based in the Los Angeles (LA) area, it will be a hybrid role working from our office in West Hollywood. For candidates based anywhere else in the US, this will be a fully remote role.
The Role
We are looking for a resourceful, execution-focused Data Engineer to join our Platform team. We’ve spent the last year building the foundation of COVU Connect - our proprietary data warehouse. Now we are moving into a high-velocity phase: scaling the Golden Policy and Account records to power our AI-native operational platform, COVU OS.
This is not a role for someone who wants to spend months in "discovery." We need a builder who thrives in a defined architectural landscape, leverages AI tools (like Gemini CLI) to ship code faster, and understands that speed-to-market is our most important KPI. You will work directly with our Lead Data Engineer to transform complex insurance logic into performant, automated pipelines.
What You’ll Do (The Mission)
- Execute the "Golden Records": Be the primary builder of the Golden Policy Journal and Golden Account Cluster. You will implement the harmonization and arbitration logic that turns messy carrier data into our single source of truth.
- AI-Augmented Development: Proactively use AI tooling (Gemini, Copilot, etc.) to accelerate ETL development, unit testing, and documentation. We value "smart speed."
- Build & Optimize Pipelines: Develop and maintain robust DAGs in Airflow and models in dbt to ensure our data is processed with high integrity and point-in-time accuracy.
- Operational Excellence: Implement "quarantine" logic for bad data and build reconciliation triggers to ensure our internal AMS matches our "Golden" state.
- Modernize & Refactor: Work within our Python-based framework to systematically replace legacy processes, ensuring every line of code is modular and SOC2 compliant.
- Collaborate via Agile: Participate in tight feedback loops with Product and Tech Leads to deliver comprehensive data integration.
What We’re Looking For
- Experience: 3–5 years in data engineering. You’ve moved past the "learning" phase and are focused on high-quality delivery, with demonstrated ownership of production pipelines - not just contributions to them.
- SQL & Python Fluency: You can write complex analytical SQL and clean, modular Python in your sleep.
- Modern Data Stack Experience: Hands-on experience with Snowflake and dbt.
  - You understand how to build dimensional models that don’t just store data but solve business problems.
  - You’ve optimized queries and managed costs, not just run SELECT statements.
  - You understand dbt project structure, testing strategies, incremental models, and materializations.
- Orchestration Skills: You've built and debugged DAGs in production, understand task dependencies, retries, and have dealt with scheduling or executor issues firsthand.
- AI-Native Mindset: You are comfortable using (and want to use) AI tools to handle boilerplate code, debug complex queries, and speed up your workflow.
- Communication: You can explain why a pipeline failed and how you’re fixing it without needing jargon gymnastics.
- Pragmatism: You know when to build a "perfect" solution and when to build a "tactical" solution that works today and scales tomorrow.
- Ownership and Trust: You can power ahead on your own and seek guidance when you need it - not after you've been stuck for a week.
- Operational instincts: You monitor what you build. You've set up alerts, investigated pipeline failures, and implemented fixes that prevent recurrence.
Bonus Points For
- Insurtech Background: If you know the difference between an Endorsement and a Reinstatement, or if you’ve ever wrestled with AMS/AL3 data, we want to talk to you.
- AWS Familiarity: Experience with S3, Lambda, ECS, RDS.
- Legacy Parsing: Ability to read/understand Java or SQL-based ETLs to help migrate them into our new Python/Snowflake environment.
Why COVU?
We are past the "experimental" stage. We have a clear vision, a stabilized architecture, and a market that is hungry for our platform. You’ll be joining a team where your work directly impacts the operational success of insurance agencies across the country.
Application Process:
- Intro call with People team
- Technical interviews
- Final interview with leaders
Please be advised that the use of any real-time AI assistance, screen-reading software, or external aids during the application process is strictly prohibited. We employ active detection methods to ensure the integrity of our hiring process. Any violation of this policy will result in immediate termination of the interview and permanent disqualification of your candidacy.