Be responsible for designing and creating data warehouse schemas that support the extraction, transformation, and loading (ETL) of data.
Be responsible for the entire testing and deployment process, ensuring that other services are not disrupted.
Be able to read, analyze, and interpret data in terms of business goals, and design the optimal ETL process around those goals.
Sanitize and transform data as necessary.
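As an illustration of the sanitization and transformation work described above, here is a minimal Python sketch. The field names, source date format, and cleaning rules are hypothetical examples chosen for the sketch, not a prescribed implementation:

```python
import re
from datetime import datetime

def sanitize_record(raw):
    """Sanitize one raw source record before loading.

    `raw` is a dict of strings as it might arrive from an extract step;
    the field names used here are illustrative.
    """
    record = {}
    # Trim whitespace and collapse internal runs of spaces in names.
    record["customer_name"] = re.sub(r"\s+", " ", raw.get("customer_name", "")).strip()
    # Normalize dates to ISO 8601; flag values that do not parse.
    try:
        record["order_date"] = (
            datetime.strptime(raw["order_date"], "%m/%d/%Y").date().isoformat()
        )
    except (KeyError, ValueError):
        record["order_date"] = None  # left for downstream review
    # Coerce currency strings to integer cents to avoid float rounding
    # in billing totals.
    amount = raw.get("amount", "").replace("$", "").replace(",", "")
    record["amount_cents"] = int(round(float(amount) * 100)) if amount else None
    return record
```

A record such as `{"customer_name": "  Jane   Doe ", "order_date": "03/07/2024", "amount": "$1,234.50"}` would come out normalized, with the amount stored as cents so later aggregation is exact.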
Be responsible for generating business reports and ensuring data validity. Maintain all reporting processes company-wide to meet business requirements.
Work closely with other developers to provide DB administration support, and take responsibility for managing database instances and maintaining scheduled jobs.
Develop internal tools that enhance development and testing efficiency, especially for data validation and integration.
Monitor and tune database instances to ensure peak performance; design and implement backup and recovery solutions.
Respond to questions about data integrity and explain all data elements and calculations presented in the reporting tools.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS "big data" technologies.
Build analytic tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Build internal and external reporting infrastructure that can automate Excel reports or serve data through an API endpoint.
Create data tools and database tables that help analytics and data science team members build and optimize our product into an innovative industry leader.
Write programs to automate various internal billing and accounting processes.
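Several of the responsibilities above center on validating data before it is loaded or reported. A minimal sketch of the kind of pre-load check an internal validation tool might perform follows; the function name, null-rate threshold, and field names are illustrative assumptions, not part of the original description:

```python
def validate_batch(rows, required_fields, max_null_rate=0.05):
    """Pre-load validation gate for a batch of ETL rows.

    Returns (ok, issues): `ok` is False if any required field's
    null/empty rate exceeds the threshold or the batch is empty.
    The 5% default threshold is an illustrative choice.
    """
    issues = []
    if not rows:
        return False, ["empty batch"]
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) in (None, ""))
        rate = nulls / len(rows)
        if rate > max_null_rate:
            issues.append(f"{field}: null rate {rate:.1%} exceeds {max_null_rate:.1%}")
    return (not issues), issues
```

A gate like this can run as a scheduled job before each load, failing the pipeline early with an explanation of which elements are problematic, rather than letting invalid data reach company-wide reports.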