
Keller Postman

Data Engineer

Posted 4 Days Ago
Remote
Hiring Remotely in USA
$160K-$170K
Senior level

Keller Postman represents a broad array of clients in class and mass actions, individual arbitrations, and multidistrict litigation matters at the trial and appellate levels in federal and state courts. Serving hundreds of thousands of clients in litigation and arbitration, we have prosecuted high-profile mass tort, antitrust, privacy, product liability, employment, and consumer-rights cases. Our firm also acts as plaintiffs’ counsel in high-stakes public-enforcement actions. Our mission is to achieve exceptional results for our clients, drive innovation in the practice of law, and pursue unparalleled excellence in everything we do.

Purpose: The Data Engineer will play a critical role in our organization, focusing on the design, construction, installation, testing, and maintenance of highly scalable data management systems. This role requires hands-on expertise with Snowflake, Azure cloud services, and Sigma to design, develop, and maintain scalable data pipelines and analytics solutions. The ideal candidate will have strong experience in cloud-based data architectures and modern ETL/ELT practices, the technical expertise to support and improve current data systems and processes, and the ability to work collaboratively with cross-functional teams.

This is a remote position, but the ideal candidate will be in the Central time zone or willing to work regular Central Time business hours; being located in the Chicago area is a plus. The compensation for this position is $160,000 to $170,000 per year, depending on experience, plus a year-end discretionary bonus and benefits.

Essential Functions:

  • Develop, construct, test, and maintain data architectures, including databases and large-scale processing systems.
  • Design, build, and optimize data pipelines and ETL/ELT processes leveraging Snowflake and Azure Services.
  • Develop and maintain Snowflake data warehouses, ensuring efficient data modeling, partitioning, and performance tuning.
  • Implement data flow processes that automate and streamline data collection, processing, and analysis.
  • Ensure data governance, quality, and security best practices across all data platforms.
  • Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
  • Provide operational support for existing data infrastructure and develop new solutions as needed.
  • Monitor, troubleshoot, and optimize system performance in Azure and Snowflake environments.
  • Support CI/CD pipelines and automation for data workflows and deployments.
  • Keep current with industry trends and innovations in data engineering and propose changes to the existing landscape.

Knowledge, Skills, Abilities:

  • Proficient in Snowflake, Databricks, or similar tools and experience in data warehousing.
  • Skilled in ETL design and data modeling.
  • Proficient in SQL (complex queries, stored procedures, optimization) and familiar with Python for data engineering tasks.
  • Experience with Salesforce data integration is a plus.
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
  • Strong knowledge of ETL/ELT patterns, orchestration, and workflow automation.
  • Familiarity with Sigma Computing for reporting, data visualization, and business user self-service analytics.
  • Understanding of data governance, security, and compliance frameworks (e.g., GDPR, HIPAA).
  • Experience with streaming data technologies (Kafka, Event Hubs, or similar).
  • Exposure to DevOps practices and Infrastructure as Code (e.g., Terraform, ARM templates).
  • Adept at queries, report writing, and presenting findings.
  • Excellent problem-solving and troubleshooting skills.
  • Ability to work in a fast-paced environment and manage multiple projects simultaneously.
  • Strong communication skills, capable of conveying complex data issues to non-technical team members.

Education/Experience:

  • Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field.
  • A minimum of 5 years of experience in a data engineering role.
  • Experience working with Azure cloud services and data warehousing technologies.
  • Relevant certifications in Azure or other cloud technologies are beneficial.

Language Ability:

  • Must be able to read, write, and speak fluent English.

Keller Postman is an Equal Opportunity Employer. For California applicants, please find our CPRA information here.

Top Skills

Arm Templates
Azure Cloud Services
Databricks
Event Hubs
Kafka
Python
Sigma
Snowflake
SQL
Terraform


