
Dick's Sporting Goods

Senior Data Engineer (REMOTE)

Posted 22 Days Ago
Remote
Hiring Remotely in United States
83K-138K Annually
Senior level
As a Senior Data Engineer, you will design and support data warehouse schemas, develop ETL processes, collaborate with teams to deliver quality data insights, and ensure operational excellence in data management.
The summary above was generated by AI

At DICK’S Sporting Goods, we believe in how positively sports can change lives. On our team, everyone plays a critical role in creating confidence and excitement by personally equipping all athletes to achieve their dreams.  We are committed to creating an inclusive and diverse workforce, reflecting the communities we serve.

If you are ready to make a difference as part of the world’s greatest sports team, apply to join our team today!

OVERVIEW:

We are creating the future of sports, driven by powerful data products and platforms that serve our Athletes and Teammates.

We are looking for a Senior Data Engineer to join our passionate team, adding your background and experience to make us even stronger. In this role, you will build datasets and make them accessible to our partner teams by writing great code that simplifies complexity and ensures quality. Your work will enable product teams, data scientists, and decision-makers across the company to bring together insights and inform our business.

We believe that trusted, easy-to-consume data is critical, and as a Senior Data Engineer your work will help build that foundation.

You will also be responsible for daily operations, including troubleshooting and job monitoring. You will be part of the growing Data team, reporting to the Sr. Director, Data Analytics.

The impact you will have:

Design/Strategy: You will design and support the business’s database and table schemas for new and existing data sources in the data warehouse, and you will create and support the ETL that brings data into the warehouse. In this capacity, you will design and develop systems for the maintenance of the business’s data warehouse, ETL processes, and business intelligence.
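
By way of illustration, here is a minimal PySpark sketch of the kind of warehouse load this responsibility covers: staging a raw extract into a customer dimension and a date-partitioned fact table. The source path, columns, and table names (raw orders, dim_customer, fact_orders) are hypothetical placeholders, not part of any actual DICK’S pipeline.

```python
# Illustrative sketch only: all paths, columns, and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_warehouse_load").getOrCreate()

# Stage a raw orders extract (placeholder source path).
raw_orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Dimension table: one row per customer, deduplicated on the natural key.
dim_customer = (
    raw_orders
    .select("customer_id", "customer_name", "loyalty_tier")
    .dropDuplicates(["customer_id"])
)

# Fact table: order grain, keyed to the dimension and partitioned by order date.
fact_orders = (
    raw_orders
    .withColumn("order_date", F.to_date("order_ts"))
    .select("order_id", "customer_id", "order_date", "store_id", "order_total")
)

dim_customer.write.mode("overwrite").saveAsTable("warehouse.dim_customer")
(
    fact_orders.write.mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("warehouse.fact_orders")
)
```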

Collaboration: You will work closely with analysts, data scientists, and other data consumers within the business to gather and deliver high-quality data for business cases. You will also work closely with other disciplines, departments, and teams across the business to come up with simple, functional, and elegant solutions that balance data needs across the business.

Analytics: You will analyze business requirements quickly and thoroughly and translate them into sound technical data designs. In this capacity, you will establish documentation for data solutions and develop and maintain technical specification documentation for all reports and processes.

What You Will Do

  • You’ll work with a variety of internal teams, including Engineering and Business, to help them solve their data needs

  • Your work will provide teams with visibility into how DICK’S products are being used and how we can better serve our customers

  • Identify data needs for business and product teams, understand their specific requirements for metrics and analysis, and build efficient and scalable data pipelines to enable data-driven decisions across DICK’S (see the orchestration sketch after this list).

  • Experience in one or more of the following: Python (Preferred), Scala, C++, or Java.

  • Design and develop reliable data models and highly efficient pipelines that build quality data and provide intuitive analytics to our partner teams.

  • Help the Data Analytics & Data Science team apply and generalize statistical and econometric models on large datasets

  • Drive the collection of new data and the refinement of existing data sources, and develop relationships with production engineering teams to manage our data structures as the DICK’S product evolves

  • Develop strong subject matter expertise and manage the SLAs for those data pipelines

  • Participate in and lead design sessions and code reviews to elevate the quality of data engineering across the organization.

  • Participate in an on-call rotation for support during and after business hours.

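To make the pipeline and SLA responsibilities above concrete, here is a minimal orchestration sketch written against Apache Airflow 2.x (Airflow appears in the qualifications below). The DAG id, schedule, task callables, and SLA windows are hypothetical placeholders, not an actual DICK’S job.

```python
# Illustrative sketch only (Airflow 2.x): DAG id, schedule, and SLAs are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**_):
    # Placeholder for the real extract step (e.g., pulling from a source system).
    print("extracting orders")


def load_warehouse(**_):
    # Placeholder for the real load/transform step into the warehouse.
    print("loading warehouse tables")


default_args = {
    "owner": "data-engineering",
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",  # runs daily at 06:00
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(
        task_id="extract_orders",
        python_callable=extract_orders,
        sla=timedelta(hours=2),  # raise an SLA miss if not finished within 2 hours of the scheduled run
    )
    load = PythonOperator(
        task_id="load_warehouse",
        python_callable=load_warehouse,
        sla=timedelta(hours=4),
    )

    extract >> load
```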

Technical Skills

  • Expert in SQL and/or SQL-based languages, including performance tuning of SQL queries

  • Strong understanding of Normalized/Dimensional model disciplines and similar data warehousing techniques.

  • Experience in one or more of the following programming languages is required: Python (preferred), Scala, C++, Java, Go, or Kotlin.

  • Strong experience with cloud-based data warehouses (e.g., Snowflake, BigQuery, Synapse, Redshift).

  • Experienced with ETL/ELT in Databricks, including the Medallion architecture, Delta Lake, Unity Catalog, Delta Sharing, and Delta Live Tables (DLT); see the Delta Live Tables sketch after this list.

  • Experience with CI/CD on Databricks using tools such as GitHub Actions and the Databricks CLI.

  • Strong grasp of data management principles: Data Lake, Data Mesh, Data Catalog, Data Quality, etc.
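
As an illustration of the Databricks items above, here is a minimal bronze-to-silver Delta Live Tables sketch in Python with one data-quality expectation. The landing path, table names, and columns are hypothetical, and `spark` is the session the DLT runtime provides.

```python
# Illustrative sketch only: a hypothetical bronze-to-silver DLT flow.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="Bronze: raw order events landed as-is via Auto Loader.")
def orders_bronze():
    # `spark` is provided by the Delta Live Tables runtime.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .load("/Volumes/example/raw/orders/")  # placeholder landing path
    )


@dlt.table(comment="Silver: cleaned, typed orders.")
@dlt.expect_or_drop("valid_order_total", "order_total >= 0")  # drop rows failing the check
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("order_date", F.to_date("order_ts"))
        .select("order_id", "customer_id", "order_date", "order_total")
    )
```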

QUALIFICATIONS:

  • 5+ years of experience in Data Warehousing and development using data technologies such as Relational & NoSQL databases, open data formats, building data pipelines (ETL and ELT) with batch or streaming ingestion, loading and transforming data. 

  • Expert in SQL and/or SQL-based languages, including performance tuning of SQL queries

  • Strong understanding of Normalized/Dimensional model disciplines and similar data warehousing techniques. 

  • Experience in one or more of the following programming languages is required: Python (preferred), Scala, C++, Java, Go, or Kotlin.

  • Strong experience working with ETL/ELT concepts of data integration, consolidation, enrichment, and aggregation on petabyte-scale data sets.

  • Experience with at least one of the following cloud platforms: Microsoft Azure (Preferred), Amazon Web Services (AWS), or Google Cloud Platform (GCP) 

  • Strong experience with cloud-based data warehouses (e.g., Snowflake, BigQuery, Synapse, Redshift).

  • Experience with message queuing and stream processing (e.g., Kafka, Flink, Spark Structured Streaming); see the streaming ingestion sketch after this list.

  • Strong grasp of data management principles: Data Lake, Data Mesh, Data Catalog, Master Data, Data Quality, etc.

  • Experience in BI tooling such as Qlik, MicroStrategy, Tableau, Power BI, or Looker

  • Experience with orchestration tools (Control-M, Airflow etc.) 

  • Strong communication skills across different mediums to craft compelling messages to drive action and alignment. 

  • Comfort with agile delivery methodologies (Scrum, SAFe) in a fast-paced, complex environment, using tools such as Jira, Confluence, and GitHub

  • Ideal candidates will have experience working with one of the following industries: Retail, Supply Chain, Logistics, Manufacturing or Marketing 

  • Proficient in Linux/Unix environments  
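
For the streaming qualification above, here is a minimal Spark Structured Streaming sketch that reads a hypothetical orders topic from Kafka and appends it to a Delta table. The broker address, topic, schema, and paths are placeholders, and the Spark Kafka connector package is assumed to be available on the cluster.

```python
# Illustrative sketch only: broker, topic, schema, and paths are hypothetical.
# Assumes the spark-sql-kafka connector is available on the cluster.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("orders_stream_ingest").getOrCreate()

order_schema = StructType([
    StructField("order_id", StringType()),
    StructField("customer_id", StringType()),
    StructField("order_total", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "orders")                      # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers bytes; parse the value column into typed fields.
parsed = (
    events
    .select(F.from_json(F.col("value").cast("string"), order_schema).alias("o"))
    .select("o.*")
)

query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/orders")  # placeholder path
    .outputMode("append")
    .start("/tmp/tables/orders_stream")                       # placeholder path
)
```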

#LI-FD1

Targeted Pay Range: $83,000.00 - $138,200.00. This is part of a competitive total rewards package that could include other components such as incentive, equity, and benefits. Individual pay is determined by a number of factors including experience, location, internal pay equity, and other relevant business considerations. We review all teammate pay regularly to ensure competitive and equitable pay. DICK'S Sporting Goods complies with all state paid leave requirements. We also offer a generous suite of benefits. To learn more, visit www.benefityourliferesources.com.

Top Skills

Airflow
Big Query
C++
Confluence
Control-M
Databricks
Flink
Git
Go
Java
JIRA
Kafka
Kotlin
Looker
Microstrategy
Power BI
Python
Qlik
Redshift
Scala
Snowflake
Spark
SQL
Synapse
Tableau
