
Varsity Brands

Senior Data Engineer

Posted 2 Hours Ago
Hybrid
4 Locations
110K-130K Annually
Mid level
The Senior Data Engineer will architect and implement data pipelines, manage their performance, and partner with stakeholders to ensure data accessibility and quality. Responsibilities include assessing data sourcing needs and providing visibility into pipeline status.
The summary above was generated by AI



 

WORK TYPE: Hybrid; applicants may be located in Texas, Indiana, Tennessee, or Kansas

 

WORK HOURS: Monday – Friday, 8am – 5pm CST

 

Applicants must be authorized to work in the U.S.; no sponsorship is offered for this role.

 

TRAVEL REQUIREMENT: Less than 5%

 

BASE PAY RATE: $110,000 - $130,000

The base salary will vary based on criteria such as education, experience and qualifications of the applicant, location, internal equity, and alignment with the market.  

 

 

HOW YOU WILL MAKE AN IMPACT

The Senior Data Engineer (Sourcing & Pipeline Management) will play an integral role within the growing Varsity Brands Data Center of Excellence team.

 

The role’s key elements are:

  • Architect + Implement: Design, build, and launch efficient, reliable data pipelines that move data from source platforms, including front-end applications, back-end systems, and third-party analytics and data services, to our enterprise data hub. In addition, design and build pipelines that supply downstream enterprise applications with prepared reference data from the enterprise data hub.
  • Orchestrate + Monitor: Manage data pipelines as an interdependent network, with proactive visibility into pipeline errors as well as costs over time.
  • Partner + Educate: Partner with stakeholders to understand business requirements, work with cross-functional data and product teams to build efficient, scalable data solutions, and use your data and analytics experience to identify gaps and propose improvements to existing systems and processes, while making your source data pipelines easily accessible to data stakeholders.

 

 

WHAT YOU WILL DO  

  • Working with data modelers and analysts to identify and prioritize data sourcing gaps.
  • Assessing the best-fit tool for any given data source.
  • Establishing pipeline cadences and timing based on analytics needs and use cases while being cost conscious.
  • Providing downstream data stakeholders with visibility into pipeline scheduling and status.
  • Responsively troubleshooting errors or alerts in existing pipelines.
  • Tracking and summarizing current period pipeline costs and trends for business and IT stakeholders.

 

 

QUALIFICATIONS

KNOWLEDGE/ SKILLS/ ABILITIES

  • Familiarity with modern data stack tools and services used to replicate data from source systems to cloud data warehouses or lakes, particularly with Snowflake, and a solid understanding of when and where a given tool is appropriate.
  • Experience using data replication tools and services such as HVR, Fivetran, Airbyte, Meltano, and Matillion is a MUST.
  • Proficiency writing custom code to source data from APIs when needed (an illustrative sketch follows this list).
  • Ability to work collaboratively with product or application owners to tease out what relevant raw data is available to source.
  • Ability to identify source system data capture opportunities to unlock analytics capabilities.
  • Strong knowledge of data architecture, data modeling, schema design, and software development principles.
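To make the API-sourcing expectation concrete, here is a minimal, hypothetical Python sketch: it pages through an imagined REST endpoint (the URL, key, and response shape are assumptions, not a real Varsity Brands integration) and lands the records as newline-delimited JSON, a format a warehouse such as Snowflake can then load with COPY INTO.

```python
# Hypothetical illustration of custom API sourcing; endpoint, auth, and
# response shape are assumptions, not a real Varsity Brands system.
import json
import requests

API_URL = "https://api.example.com/v1/orders"  # assumed source endpoint
API_KEY = "replace-with-secret"                # supplied by a secrets manager in practice


def fetch_pages(url, page_size=500):
    """Yield pages of records from a simple offset-paginated REST API."""
    offset = 0
    while True:
        resp = requests.get(
            url,
            headers={"Authorization": f"Bearer {API_KEY}"},
            params={"limit": page_size, "offset": offset},
            timeout=30,
        )
        resp.raise_for_status()
        records = resp.json().get("data", [])
        if not records:
            break
        yield records
        offset += page_size


if __name__ == "__main__":
    # Land raw records as newline-delimited JSON for a downstream COPY INTO load.
    with open("orders.ndjson", "w", encoding="utf-8") as out:
        for page in fetch_pages(API_URL):
            for record in page:
                out.write(json.dumps(record) + "\n")
```

In practice, credentials would come from a secrets manager and the landed file would be written to a cloud stage rather than local disk.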

 

 

EDUCATION/ EXPERIENCE

  • 3+ years of experience in the data engineering/warehousing space, including custom ELT/ETL design, implementation, and maintenance.
  • 3+ years of experience writing SQL in an analytics or data pipeline context.
  • 2+ years of experience in at least one language (Python, Scala, Java, etc.) in a data engineering or analytics context.
  • 1+ year of experience using an orchestration tool or service to coordinate ELT and downstream analytics pipelines (see the sketch after this list).
  • Experience using REST APIs to acquire and flow data from source to target systems.
  • Experience working with cloud data analytics platforms and tools, particularly Snowflake, dbt, Tableau, and Power BI, is a MUST.
  • Experience standing up data pipelines from SAP ERP is a plus.
  • Experience standing up data pipelines from Google Analytics 4 is a plus.
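As one way to picture the orchestration experience above, the sketch below uses Apache Airflow, a common orchestration tool (the posting does not prescribe a specific one), to chain an extract step, a warehouse load, and a downstream dbt run; the DAG id, task bodies, and dbt command are illustrative assumptions.

```python
# Illustrative Airflow DAG; DAG id, task bodies, and dbt command are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders():
    """Placeholder: call a replication tool or custom API script to land raw data."""


def load_to_warehouse():
    """Placeholder: COPY the landed files into the enterprise data hub (e.g., Snowflake)."""


with DAG(
    dag_id="orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # named schedule_interval on Airflow versions before 2.4
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    dbt_build = BashOperator(task_id="dbt_build", bash_command="dbt run")

    # Downstream analytics models run only after the raw data has landed,
    # which also gives stakeholders one place to check pipeline status.
    extract >> load >> dbt_build
```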

 

 

PHYSICAL REQUIREMENTS

This job operates in a professional office environment. It is a largely sedentary role, with some filing that requires the ability to lift files, open filing cabinets, and bend or stand on a stool as necessary. The ability to sit or stand for long periods during meetings and while operating office equipment such as PCs, laptops, and telephones is required.


 

Top Skills

Airbyte
dbt
Fivetran
HVR
Java
Matillion
Meltano
Power BI
Python
Scala
Snowflake
SQL
Tableau
