We are seeking a Senior Software Engineer with deep expertise in data pipeline engineering to join our team. In this role, you will design, build, and maintain highly scalable data pipelines that enable secure, reliable, and performant data movement across systems. You will collaborate closely with product, data, and infrastructure teams to architect and deliver solutions that support advanced analytics, business intelligence, and business-critical applications.
The ideal candidate brings a software engineering mindset to data problems, combining strong programming fundamentals with hands-on experience building resilient pipelines, streaming workflows, and ETL/ELT frameworks.
What You’ll Do
Design & Development
Architect, implement, and optimize batch and streaming data pipelines to move, transform, and process structured and unstructured data at scale.
Apply software engineering best practices (code reviews, testing, CI/CD, version control) to data pipeline development.
Ensure pipelines are modular, reusable, and extensible.
Data Operations & Reliability
Build monitoring, logging, and alerting frameworks to ensure pipeline reliability and data quality.
Implement data validation, schema evolution handling, and error recovery mechanisms.
Troubleshoot production issues and perform root cause analysis.
Collaboration & Leadership
Partner with delivery teams to understand requirements and deliver end-to-end solutions.
Mentor junior engineers, set coding standards, and advocate for best practices in pipeline development.
Contribute to technical design reviews, architectural decisions, and long-term data platform strategy.
Scalability & Performance
Optimize pipelines for high-volume, low-latency data processing.
Evaluate and implement modern frameworks and cloud-native solutions (e.g., Databricks, Kafka, Airflow).
Ensure systems can handle future growth and evolving data use cases.
What You’ll Need
Minimum Qualifications
Bachelor’s or Master’s degree in Computer Science, Engineering, or related field, or equivalent practical experience.
2+ years of software engineering experience.
Preferred Qualifications
3+ years of experience working on data pipelines, data integration, or ETL/ELT systems.
Strong skills in SQL with the ability to design and optimize complex queries.
Hands-on experience with data pipeline frameworks (e.g., Apache Airflow, dbt, Dagster).
Expertise in streaming technologies (e.g., Kafka, Kinesis).
Deep understanding of databases and data warehouses (e.g., PostgreSQL, Snowflake, Aurora MySQL).
Strong knowledge of cloud platforms (AWS or Azure), including storage, compute, and serverless data services.
Proven ability to design for scalability, performance, and reliability in production-grade pipelines.
Experience with infrastructure-as-code (Terraform, CloudFormation) and containerization (Docker, Kubernetes).
Familiarity with data governance, lineage, and cataloging tools.
Understanding of sustainability reporting, disclosure practices, or compliance-related data workflows.
Contributions to open-source data frameworks.
Strong communication skills and ability to influence cross-functional teams.
Travel Requirements & Working Conditions
Ability to travel up to 15% of the time for internal and team meetings or conferences.
Reliable internet access for any period of time spent working remotely outside of a Workiva office.
How You’ll Be Rewarded
✅ Salary range in the US: $129,000.00 - $207,000.00
✅ A discretionary bonus typically paid annually
✅ Restricted Stock Units granted at time of hire
✅ 401(k) match and comprehensive employee benefits package
The salary range represents the low and high end of the salary range for this job in the US. Minimums and maximums may vary based on location. The actual salary offer will carefully consider a wide range of factors, including your skills, qualifications, experience, and other relevant factors.
Employment decisions are made without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other protected characteristic.
Workiva is committed to working with and providing reasonable accommodations to applicants with disabilities. To request assistance with the application process, please email [email protected].
Workiva employees are required to undergo comprehensive security and privacy training tailored to their roles, ensuring adherence to company policies and regulatory standards.
Workiva supports employees in working where they work best - either from an office or remotely from any location within their country of employment.