Company overview:
Blue Orange Digital is a cloud-based data transformation and predictive analytics development firm with offices in NYC and Washington, DC. From startups to Fortune 500s, we help companies make sense of their business challenges by applying modern data analytics techniques, visualizations, and AI/ML. Founded by engineers, we love passionate technologists and data analysts. Our startup DNA means everyone on the team makes a direct contribution to the growth of the company.
Note: This is a long-term contractor opportunity, structured as an independent contractor role, with potential to transition into a full-time position later. The client requires a contractor arrangement at this stage. Candidates must apply directly; we cannot consider candidates represented by third-party agencies.
Position overview:
Blue Orange is looking for an exceptional Sr. Data & DevOps Engineer to join our talented multi-disciplinary team and work with a high-impact enterprise client within the global supply chain and logistics space.
You are a multidisciplinary, hands-on, action-oriented data and DevOps engineer, adept with AWS CDK in Python and AWS services. You are as fluent in SQL variants as you are in Python, Spark, and the standard data libraries, and comfortable in other languages like JavaScript. You consider scalable microservice architectures, APIs, tiering, SaaS, IaaS, IaC, etc. table stakes. You are at home in AWS across a range of managed data and software services: from RDS and Aurora to Atlas, from VMs to containers to Lambdas, along with CI/CD, DevOps tooling, Terraform, CloudFormation, OpenTofu, git, etc.
You like to move fast and break things, and then fix them even faster!
You learn quickly, collaborating with colleagues tactically and working hands-on to master things rapidly. You like a good challenge and enjoy solving data and code puzzles every day. You thrive in a fast-paced environment where you can make a difference contributing to a team and advancing the business it serves.
If this sounds like you, we have an opportunity that we would like to discuss with you.
Responsibilities:
- Drive innovation in data pipelines and AWS services using AWS CDK in Python
- Work fast with client experts and stakeholders to learn the existing data flows, code bases, infrastructure, operations, log methods, etc.
- Build and maintain data ingestions, data models, orchestrations, transformations, and validation tests
- Quickly master the data flows, datasets, and code bases so you can begin enhancing the platform as a whole, both functionally and non-functionally.
- Quickly develop deep operational command of all microservices, code bases, and data flows supporting the platform in AWS.
- Evolve the data architecture in collaboration with the existing Team to take on adjacent platform missions and volumes.
- Stay professionally and aggressively curious about the platform, its code, and its data, together with your colleagues.
- Work in a fast-moving Agile delivery mode to continuously deliver value for our clients.
Requirements:
- Strong working command of AWS CDK and AWS services.
- At least 7 years' experience building and supporting data platforms; exposure to data technologies, e.g., RDS, DynamoDB, Redshift, EMR, Glue, Kafka, Kinesis, MSK, Data Pipeline, Lake Formation, dbt, Airflow, Spark, etc.
- Experience with AWS and exposure to other cloud data platforms, such as Azure Data Factory (ADF), Microsoft Fabric, Snowflake, and Databricks.
- Advanced level Python, SQL, and Bash scripting.
- Experience designing and building robust CI/CD pipelines.
- Comfortable with Docker, configuration management, and monitoring tools.
- Knowledge of best practices related to security, performance, and disaster recovery.
- Excellent verbal and written English communication.
- BA/BS degree in Computer Science or a related technical field, or equivalent practical experience.
- Interact with others using sound judgment, good humor, and consistent fairness in a fast-paced environment.
- Ability to maintain poise, efficiency, and effectiveness in fast-paced, sometimes frenetic, high-stakes environments.
Preferred qualifications:
- 12+ years in a data engineering role, with experience in ETL, data warehousing, data lakes, lakehouses, pipelines, modeling, and data quality validation
- Expert experience with data ingestion, modeling and conformance/compliance validation
- Expert-level skills with SQL, statistical analysis and data validation
- Experience with GCP, Azure, Snowflake, Oracle, etc.
- BA or BS degree in a technical or quantitative field (e.g.: computer science, statistics)
- Experience in the logistics services industry is a plus
- Experience with R, Python, Scala, SPSS, Teradata, SAS, Power BI, Tableau, or Looker is a plus
- Certifications in AWS, Azure DevOps, Azure Data Fundamentals, Databricks, or Snowflake are a plus
Salary: $150,480 - $170,544 per year ($12,540 to $14,212 per month) - USD
Blue Orange Digital is an equal opportunity employer.
Background checks may be required for certain positions/projects.