
Builder Prime

Data Engineer

Reposted 3 Days Ago
Remote
Hiring Remotely in United States
125K-145K Annually
Mid level
THE COMPANY

At Builder Prime, we’re revolutionizing the home improvement industry by doing more than just building software—we're giving hardworking contractors their time and freedom back. Our all-in-one business management solution is the essential platform for companies at their "level-up" moment, seamlessly integrating CRM, estimating, production management, payments, and reporting. By replacing disjointed tools with a unified data and automation foundation, we empower businesses to operate more efficiently, win more jobs, and achieve their growth goals. We look for collaborative and community-oriented team members who bring initiative and curiosity to the challenge of solving problems for our customers.

In 2023, Builder Prime raised its Series A financing from Blueprint Equity and additional investors. We recently announced our Series B, also financed by Blueprint Equity.

THE ROLE

This is a remote position open to candidates who are authorized to work in the US without visa sponsorship and who presently reside in the US. There is a preference for candidates in states where we already have Builder Prime employees: California, Colorado, Florida, Georgia, Idaho, Indiana, Nebraska, New York, Ohio, Oregon, Texas, Utah, Virginia, Washington, and Wisconsin. Builder Prime has an office in the greater Denver metro area (Westminster) that the majority of the sales team works out of.

We are seeking an experienced Data Engineer to join our team at Builder Prime. This role requires deep technical expertise in SQL, data transformation pipelines, and modern cloud data platforms. You'll work on critical data infrastructure that powers our analytics and business intelligence capabilities, with opportunities to tackle challenging technical problems involving both real-time streaming and batch data architectures.

This position offers a unique opportunity to work in a collaborative, remote-friendly environment where data quality and reliability are paramount. You'll be part of a team that values deep technical curiosity, methodical problem-solving, and the ability to balance pragmatic solutions with technical excellence. We're looking for someone who not only has strong technical skills but can also communicate complex concepts effectively and contribute to our modern data stack using cutting-edge tools.

This role will report directly to our Staff Data Engineer and will sit on the Engineering team.

THE DAY-TO-DAY

  • Design, develop, and maintain scalable data pipelines and transformations

  • Build and optimize complex SQL queries and data models

  • Implement and manage real-time and batch data ingestion workflows using CDC

  • Orchestrate end-to-end data pipelines with monitoring and alerting

  • Build reports and dashboards in BI tools to support business analytics

  • Collaborate with analytics and engineering teams to deliver reliable data solutions

  • Ensure data quality, consistency, and reconciliation across the pipeline, upholding reliability and governance standards

  • Monitor and optimize data platform performance and costs

THE MUST-HAVES

Core Technical Skills:

  • SQL Proficiency (Expert): Advanced query optimization, performance tuning, and deep understanding of database internals, complex analytical queries (window functions, CTEs), query plan analysis, and index optimization.

  • Database Fundamentals (Strong): Expertise in join types (sort-merge, hash, nested loop), index vs. sequential scan tradeoffs, query optimization, materialized views, incremental computation, transaction isolation, and concurrency control.

  • Python (Proficient): Writing scripts for data processing/transformation, using data libraries (pandas, polars), API integrations, automation, and code quality/testing.

  • Change Data Capture (CDC) (Strong): Experience with CDC tools (Debezium, AWS DMS, etc.), log-based replication, handling schema evolution, understanding at-least-once/exactly-once semantics, idempotency, and backfilling historical data.

  • dbt (3+ years): Building and scaling dbt projects, semantic layer/dimensional modeling, incremental models/materialization, testing, documentation, CI/CD for analytics, and package management.

  • Snowflake (3+ years): Expertise in S3 integration/external stages, Snowpipe (continuous and streaming), warehouse sizing/cost optimization, serverless features/task orchestration, data sharing, performance monitoring, governance (resource monitors), data lifecycle management, and understanding micro-partitions/clustering.

  • Core Concepts (Foundational): Understanding event time vs. processing time, watermarks/late data handling, backfilling strategies, stream-to-stream joins, and windowing operations (tumbling, sliding, session).

  • BI Tool Experience (Required): Building reports, dashboards, and visualizations in modern BI platforms (Looker, Tableau, Power BI, Metabase, etc.). Translating business needs into data models/metrics, optimizing data models for performance, and collaborating with stakeholders.

  • Analytics Partnership: Cross-functional collaboration with analytics/business teams, data model documentation (data dictionaries), supporting ad-hoc analysis, and balancing flexibility with governance.
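For illustration only: the table, data, and values below are hypothetical, and SQLite stands in for a warehouse like Snowflake. This is the kind of CTE-plus-window-function pattern the SQL bullets above refer to, sketched in Python:

```python
import sqlite3

# Hypothetical data: rank each customer's jobs by value using a CTE
# and a window function, then keep the top job per customer.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE jobs (customer TEXT, job_id INTEGER, value REAL);
    INSERT INTO jobs VALUES
        ('acme', 1, 1200.0), ('acme', 2, 3400.0),
        ('bolt', 3, 900.0),  ('bolt', 4, 2100.0);
""")

query = """
WITH ranked AS (
    SELECT customer, job_id, value,
           ROW_NUMBER() OVER (
               PARTITION BY customer ORDER BY value DESC
           ) AS rn
    FROM jobs
)
SELECT customer, job_id, value
FROM ranked
WHERE rn = 1
ORDER BY customer;
"""

top_jobs = conn.execute(query).fetchall()
print(top_jobs)  # highest-value job per customer
```

Window functions require SQLite 3.25+, which ships with any recent Python; in a warehouse setting the same query shape applies unchanged.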

Experience:

  • 3+ years of production data engineering experience

  • 3+ years working with dbt or similar transformation frameworks

  • 3+ years hands-on experience with Snowflake

  • Experience building CDC pipelines and handling real-time data ingestion

  • Demonstrated ability to build reports and dashboards in BI tools

  • Track record of building and maintaining data pipelines at scale

  • Experience with data quality validation and reconciliation

  • Track record of optimizing data platform performance and costs

THE NICE-TO-HAVES

  • Experience with data orchestration tools (Airflow, Dagster, Prefect, or similar)

  • AWS services (S3, RDS, DMS, EKS)

  • PostgreSQL administration and replication

  • Kubernetes and containerization

  • Real-time materialized views and incremental view maintenance

  • Data warehouse optimization for columnar storage

  • Experience with streaming SQL transformations

  • Understanding of eventual consistency and distributed systems

  • Data quality frameworks and testing tools (Great Expectations, Soda, etc.)

  • Understanding of platforms like Kafka, Flink, or similar

  • Knowledge of or experience in SaaS and the home improvement industry

  • Strong understanding of business management CRM software and its application in improving business processes
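As a toy illustration of the windowing concepts mentioned above (tumbling windows, event time vs. arrival order), here is a minimal sketch in plain Python; it is not tied to any particular streaming framework, and the event data is invented:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group events into fixed, non-overlapping (tumbling) windows,
    keyed by event time rather than arrival order."""
    counts = defaultdict(int)
    for event_time, _payload in events:
        # Floor the event time down to the start of its window.
        window_start = (event_time // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

# Events as (event_time_seconds, payload); arrival order is shuffled
# to mimic out-of-order delivery.
events = [(3, "a"), (17, "b"), (9, "c"), (24, "d"), (11, "e")]
print(tumbling_window_counts(events, 10))
# {0: 2, 10: 2, 20: 1}
```

Sliding and session windows differ only in how window boundaries are assigned; real streaming systems add watermarks to decide when a window can be finalized despite late data.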

THE BENEFITS

Why You'll Love Working Here:

  • Exceptional Health Coverage – We cover 95% of medical, dental, and vision premiums for employees (Aetna & Guardian), plus 50% for dependents

  • Strong Retirement Match – 401k with up to 4% company match that's immediately vested (no waiting period!)

  • Generous Time Off – 15 PTO days, 48 hours sick time, 9 paid holidays, plus a paid volunteer day to give back to your community

  • Annual International Retreat – All-expenses-paid company trip to connect with teammates in amazing destinations (2026: Cancun, Mexico!)

  • Life Happens Coverage – Company-paid life insurance ($50k), AD&D, and short-term disability, plus optional accident, hospital, critical illness, and additional life insurance

  • Grow With Us – $300 annual professional development stipend, plus an incredible sabbatical at 5 years (2 extra weeks PTO + $1,000 travel bonus)

  • Comprehensive Parental Leave – 10 weeks paid parental leave for birthing parents, 4 weeks for non-birthing parents

  • Fully Equipped From Day One – Company-provided laptop and all the equipment you need to succeed

PLUS…

  • Room to Grow: We LOVE to promote from within. Show us your best stuff, and you’ll have ample opportunity to grow and advance.

  • Flexible Work Arrangements: Our team is a mix of hybrid and fully remote roles across multiple time zones, supporting work-life balance.

  • We Hate Red Tape: We never restrict our team’s creativity, so you’ll have the freedom to experiment and test out new ideas. At Builder Prime, “Iterate To Innovate” is a foundational philosophy.

  • Strong Culture: Through platforms like Bonusly and the Coffee Chat Slack app, along with other virtual and in-person events and initiatives, we strive to make work a place where you can do the best work of your career AND have fun. There is no reason it can't be both!

This position will pay a base salary of $125,000-$145,000 depending on experience.

Builder Prime is an Equal Opportunity Employer and values diversity at all levels. We also participate in E-Verify to confirm employment eligibility in the U.S.

Top Skills

BI Tools
Change Data Capture
dbt
Python
Snowflake
SQL
