Sr. Data Engineer

Sorry, this job was removed at 2:40 p.m. (PST) on Tuesday, January 7, 2025

Sr. Data Engineers – 18-month project, 100% remote

This role reports to the Manager of Acquisition Resolutions. The team owns the acquisition piece: collecting data from all channels into a single consolidating tool. The engineer writes scripts to pull data out of Datorama, applies business logic, builds dimensional models, writes ETL scripts, and copies data into the Snowflake environment.

Candidate needs:
* Snowflake data warehouse experience (open to other DWs)
* Strong SQL skills and a solid understanding of data
* Basic Spark coding
* Some exposure to Airflow
* Python (nice to have); understands how data is consumed
* Data modeling, Databricks

Expectations: tight timeline – the project must be delivered by July. Must be able to work independently with a focus on deliverables. This is a senior-level role: the candidate must collaborate well with the team and be a proactive self-starter.

The Data Engineer will partner with business, analytics, and engineering teams to design, build, and maintain easy-to-use data structures that facilitate reporting and monitoring of key performance indicators. Collaborating across disciplines, you will identify internal and external data sources, design table structures, define ETL strategy and automated QA checks, and implement scalable ETL solutions.

Responsibilities:
* Partner with technical and non-technical colleagues to understand data and reporting requirements.
* Work with Engineering teams to collect required data from internal and external systems.
* Design table structures and define ETL strategy to build performant data solutions that are reliable and scalable in a fast-growing data ecosystem.
* Develop Data Quality checks for source and target data sets. Develop UAT plans and conduct
QA.
* Develop and maintain ETL routines using ETL and orchestration tools such as Airflow, Luigi, and Jenkins.
* Document and publish metadata and table designs to facilitate data adoption.
* Perform ad hoc analysis as necessary.
* Perform SQL and ETL tuning as necessary.
* Create runbooks and actionable alerts as part of the development process.

Basic Qualifications:

* Strong understanding of data modeling principles, including dimensional modeling and data normalization.
* Experience using analytic SQL, working with traditional relational databases and/or distributed systems such as Hadoop/Hive, BigQuery, or Redshift.
* Experience with programming languages (e.g., Python, R, bash) preferred.
* Experience with workflow management tools (Airflow, Oozie, Azkaban, UC4).
* Good understanding of SQL engines and the ability to conduct advanced performance tuning.
* Experience with Hadoop (or similar) Ecosystem (MapReduce, Yarn, HDFS, Hive, Spark,
Presto, Pig, HBase)
* Familiarity with data exploration / data visualization tools like Tableau, Looker, Chartio, etc.
* Ability to think strategically, analyze and interpret market and consumer information.
* Strong communication skills – written and verbal presentations.
* Excellent conceptual and analytical reasoning competencies.
* Degree in an analytical field such as economics, mathematics, or computer science is desired.
* Comfortable working in a fast-paced and highly collaborative environment.
* Process-oriented with great documentation skills.


Location

101 S. 1st. St , Burbank, CA 91502

Hiring company: Foothills Consulting Group