The Data Engineering team at GoGuardian is home to one of the most valuable educational technology data stores anywhere, powering the student learning experience across some of the largest school districts in the world. The team comprises a highly diverse mix of passionate, data-savvy engineers working with some of the industry's leading cutting-edge technologies. GoGuardian is a preferred client of the largest cloud computing platform in the Southern California region, primarily due to the maturity of our data infrastructure stack and the consistent year-over-year growth of our business.
The Data Engineering team is seeking a highly motivated and passionate data architect with a background in designing Big Data and Business Intelligence (BI) systems, preferably across diverse on-prem and cloud-based platforms. You will have the opportunity to join a highly dynamic, creative team of data engineers who are obsessed with student education and build world-class solutions to problems at scale! We are looking for a Data Architect who is curious and passionate about data, owns data deliverables such as models, glossaries, and dictionaries, and can govern the data.
- 8+ years of experience in the data space spanning data modeling, integration, Extract-Transform-Load (ETL) processing and data pipeline design and implementation.
- Extensive database experience with SQL, relational databases, and NoSQL databases such as MongoDB and Elasticsearch.
- Extensive experience with AWS data management services (S3, Redshift) and related tools (Athena, EMR, and Kinesis, to name a few) in an enterprise environment.
- Experience with service-oriented architecture and API design is highly preferred.
- Create data models at all levels including conceptual, logical, and physical for both relational and dimensional solutions.
- Work with business stakeholders to identify critical data entities and attributes and capture how data is interpreted by users in various parts of the organization.
- Collaborate within an agile, multi-disciplinary team of engineers, product, data analysts, UX designers, QA and scrum master to solve customer problems effectively.
- Identify data quality metrics and guide processes that track and measure data quality continually.
- Develop standards and design solutions for data acquisition, integration and transformation that are scalable and reliable.
- Partner with a cross-functional team of external cloud providers and company business stakeholders to optimize our cloud infrastructure usage and related storage costs.
- Work with the engineering team and related business stakeholders to develop a strategic operational plan for our data infrastructure, with a specific focus on data quality and fault tolerance across the data platform.
What you'll need
- Bachelor's/Master's degree in Computer Science or a related field of study. Strong experience with data tool integration and related processes.
- Experience prioritizing and executing data management initiatives across a diverse group of engineering and business stakeholders in a highly agile scrum based environment.
- Extensive experience in data delivery across highly distributed systems using streaming technologies (Kafka, Kinesis, and Flume, to name a few).
- Strong problem-solving skills to build robust, scalable and maintainable technical solutions at scale.
- Help identify and design solutions that enable root cause analysis, supporting proactive issue resolution and data quality maintenance.
- Practical experience applying machine learning, natural language processing, and statistical analysis is a plus.
- Help achieve executive buy-in for data management initiatives across the company.