Sr. Data Engineer
Discover It Here.
At Nordstromrack.com and HauteLook, we strive to empower shoppers through choice and discovery of the hottest fashion at great prices. At the intersection of technology, fashion and design, we value employees who have great in-“sites” to fashion and e-commerce, act fast, think creatively and embody our customer-first mentality. Our fast-paced, dynamic culture attracts creative, passionate individuals with a determined, can-do attitude and entrepreneurial spirit. We work hard and play hard in a fun, casual and collaborative work environment in the heart of Downtown LA.
Summary:
Data Engineering is a nimble team of engineers, sitting at the hub of technology and the business. Using the latest open-source tech, we supply the company with the information, tools, and training to make smarter use of our vast datasets. We’re looking for a Data Engineer with the software chops to build not only the pipelines to move data between systems, but also the next generation of tools to enable us to take full advantage of that data. In this role, your work will broadly influence the company's data consumers, from analysts to top executives.
A day in the life…
- Partner directly with business stakeholders from beginning to end of their data projects – understand the context and goals, find and collect the data needed to meet those goals, and help them visualize it and tell the story
- Migrate existing batch jobs to Spark jobs
- Build and orchestrate data pipelines
- Design and architect data lakes, data marts, data models, and data warehouses
- Ensure the efficiency of data science workflows and advanced machine learning algorithms
- Contribute to open source solutions and communities
- Stay current on emerging tools and technologies
- Collaborate cross-functionally with other software engineers and their teams
- Establish and demonstrate technologies, solutions, and leading practices
- Balance resources, requirements, and complexity
- Provide occasional on-call production support after hours
You own this if you have…
- 5+ years of programming experience in Python, R, and shell scripting
- 3+ years of experience with Spark/PySpark or equivalent distributed processing systems
- Proven ability to look at solutions unconventionally, explore opportunities, and devise innovative solutions
- Experience with workflow management tools (e.g. Airflow)
- Strong SQL skills and all-around data know-how
- 2+ years of experience with serverless compute platforms (Lambda, Cloud Functions, etc.)
- 2+ years of experience with Streaming platforms like Kafka or Kinesis
- 3+ years of experience using AWS or Google Cloud Platform for data applications (EC2, S3, Redshift, Data Pipeline, EMR, Glue, Athena, DynamoDB, ECS, BigQuery, etc.)
- Familiarity with CI/CD workflows (e.g. CircleCI)
- Experience building infrastructure as code (Terraform, Ansible)
- Experience with BI tools (e.g. Tableau, Looker, MicroStrategy)
- Familiarity with container orchestration services (ECS, Kubernetes, etc.)
- Data lake experience is a huge plus!
- Comfortable working in a Linux environment
- Excellent verbal and written communication and interpersonal skills, with the ability to communicate effectively with both business and technical teams
- Experience in an e-commerce environment