Data Engineer

| Remote
Sorry, this job was removed at 1:42 p.m. (PST) on Thursday, April 8, 2021

ClickUp is looking for a passionate and dedicated data engineer to help us create and scale our analytics architecture. You’ll be joining the analytics team powering our business intelligence, data science, and machine learning projects, empowering us to get 1% better every day! ClickUp is the fastest-growing productivity app, reaching 200,000+ teams and a $1 billion valuation in under 3 years, as validated by our $100M Series B. We’re the most recent unicorn startup!

The analytics team is responsible for designing and supporting our AWS analytics infrastructure and dashboard integrations. This includes data ingestion, ETL, storage, query caching, and optimization. Our analytics infrastructure is currently in its early stages, so you will have significant influence on its development. An ideal candidate is an SQL wizard with experience designing and implementing highly reliable, scalable analytics infrastructure on AWS and Snowflake.

Collaboration and teamwork are vital to how ClickUp operates. A significant portion of your responsibilities will include working closely with members of other teams to uncover their data requirements and implement high quality, maintainable pipeline updates to support their reporting needs.

We’re scaling quickly, so we’re recruiting teammates who share our core values, know how to get sh*t done, and would add a lot to our extremely driven culture.

The Role

  • Create and maintain optimal data pipeline architecture.
  • Assemble, clean, and transform complex data sets to support reporting and analytical tasks.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Airflow, SQL, Python, Snowflake, Fivetran, AWS.
  • Identify and troubleshoot errors and performance issues that occur within our infrastructure.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our analytics data separated and secure across national boundaries, using multiple data centers and AWS regions.
  • Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.

Qualifications

  • 3+ years of experience writing and maintaining production-level databases and ETL pipelines.
  • Experience with pipeline orchestration and monitoring tools such as Airflow.
  • Experience building processes and documentation supporting data transformation, cataloging, data structures, metadata, dependency management, and workload management.
  • Expert knowledge of SQL and relational databases.
  • Expert knowledge of automation and Python.
  • Experience with Snowflake, AWS, or other cloud big data services.
  • A strong, operationally focused self-starter and problem-solver.
  • Excellent interpersonal, written, and oral communication skills.

Location

Our company is located in East Village, close to lots of restaurants and Petco Park! There are plenty of parking and public transportation options.
