
Tiger Analytics

Principal Data Engineer (Azure)

Reposted 24 Days Ago
In-Office or Remote
Hiring Remotely in Toronto, ON
Senior level
Description

Tiger Analytics is a global AI and analytics consulting firm. With data and technology at the core of our solutions, we solve problems that ultimately impact the lives of millions globally. Our culture is modeled on expertise and respect, with a team-first mindset. Headquartered in Silicon Valley, we have delivery centers across the globe and offices in multiple cities across India, the US, UK, Canada, and Singapore, as well as a substantial remote global workforce.

We’re Great Place to Work-Certified™. Working at Tiger Analytics, you’ll be at the heart of an AI revolution. You’ll work with teams that push the boundaries of what is possible and build solutions that energize and inspire.

Requirements

Curious about the role? Here's what your typical day would look like.

As a Principal Data Engineer (Azure), you will bring hands-on experience with the Azure cloud and Databricks, along with some exposure to data modelling. You will build and learn about a variety of analytics solutions and platforms (data lakes, modern data platforms, data fabric solutions, etc.) using different open-source, big data, and cloud technologies on Microsoft Azure.

● Design and build scalable, metadata-driven data ingestion pipelines (for batch and streaming datasets)

● Conceptualize and execute high-performance data processing for structured and unstructured data, and data
harmonization

● Schedule, orchestrate, and validate pipelines

● Design exception handling and log monitoring for debugging

● Ideate with your peers to make tech stack and tools-related decisions

● Interact and collaborate with multiple teams (Consulting, Data Science, and App Dev) and various stakeholders to meet deadlines and bring analytical solutions to life.
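As an illustration only, the metadata-driven dispatch pattern behind such ingestion pipelines, together with the exception handling and log monitoring mentioned above, can be sketched in plain Python. All names and configs here are hypothetical; in a real Azure setup the metadata would live in a control table and the readers would be ADF activities or PySpark jobs.

```python
import csv
import io
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingestion")

# Hypothetical source metadata. In practice this would come from a
# control table or config store, not be hard-coded.
SOURCES = [
    {"name": "orders", "format": "csv", "data": "id,amount\n1,9.99\n2,4.50\n"},
    {"name": "events", "format": "json", "data": '[{"id": 1, "type": "click"}]'},
]

# The "metadata-driven" part: the reader is chosen from source metadata,
# so adding a new source format means adding an entry here, not new pipeline code.
READERS = {
    "csv": lambda raw: list(csv.DictReader(io.StringIO(raw))),
    "json": lambda raw: json.loads(raw),
}

def ingest(sources):
    """Dispatch each source to a reader chosen from its metadata."""
    results = {}
    for src in sources:
        try:
            reader = READERS[src["format"]]
            records = reader(src["data"])
            results[src["name"]] = records
            log.info("ingested %s (%d records)", src["name"], len(records))
        except Exception:
            # Exception handling keeps one bad source from failing the whole run;
            # the logged traceback is what log monitoring would alert on.
            log.exception("failed to ingest %s", src["name"])
    return results

tables = ingest(SOURCES)
```

The same shape scales up: swap the lambdas for `spark.read` calls and the dict for a control table, and the dispatch logic is unchanged.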

What do we expect?

● Experience in implementing a Data Lake with technologies like Azure Data Factory (ADF), PySpark, Databricks, ADLS, and Azure SQL Database

● A comprehensive foundation with working knowledge of Azure Synapse Analytics, Event Hub & Streaming
Analytics, Cosmos DB, and Purview

● A passion for writing high-quality code that is modular, scalable, and free of bugs, with strong debugging skills in SQL, Python, or Scala/Java

● Enthusiasm for collaborating with various stakeholders across the organization and taking complete ownership of deliverables

● Experience in using big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4j, and Elasticsearch

● A solid understanding of different file formats such as Delta Lake, Avro, Parquet, JSON, and CSV

● Good knowledge of designing and building REST APIs, with hands-on experience on Data Lake or Lakehouse projects

● Experience in supporting BI and Data Science teams in consuming the data in a secure and governed manner

● Certifications like Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE) are a valuable addition

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Job Requirements

  • Mandatory: Azure Data Factory (ADF), PySpark, Databricks, ADLS, Azure SQL Database
  • Optional: Azure Synapse Analytics, Event Hub & Streaming Analytics, Cosmos DB, and Purview
  • Strong programming, unit testing, and debugging skills in SQL, Python, or Scala/Java
  • Some experience with big data technologies like Hadoop, Spark, Airflow, NiFi, Kafka, Hive, Neo4j, and Elasticsearch
  • Good understanding of different file formats like Delta Lake, Avro, Parquet, JSON, and CSV
  • Experience working on Agile projects and following DevOps processes with technologies like Git, Jenkins, and Azure DevOps
  • Good to have:
      • Experience working on Data Lake and Lakehouse projects
      • Experience building REST services and implementing service-oriented architectures
      • Experience supporting BI and Data Science teams in consuming data in a secure and governed manner
      • Certifications like Data Engineering on Microsoft Azure (DP-203) or Databricks Certified Developer (DE)
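To make the "unit testing & debugging skills in SQL" requirement concrete, here is a minimal sketch of unit-testing a SQL transformation in memory with Python's built-in sqlite3 module. The table and column names are illustrative, not from any real project; production code would run the same query against Azure SQL Database or Databricks SQL.

```python
import sqlite3

def latest_per_customer(conn):
    """Return the most recent update timestamp per customer.

    A small, self-contained SQL transformation: exactly the kind of
    logic worth pinning down with a unit test before it runs at scale.
    """
    return conn.execute(
        """
        SELECT customer_id, MAX(updated_at) AS updated_at
        FROM raw_customers
        GROUP BY customer_id
        ORDER BY customer_id
        """
    ).fetchall()

# In-memory fixture: fast, isolated, and disposable, which is what
# makes SQL logic unit-testable in the first place.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_customers (customer_id INTEGER, updated_at TEXT)")
conn.executemany(
    "INSERT INTO raw_customers VALUES (?, ?)",
    [(1, "2024-01-01"), (1, "2024-03-01"), (2, "2024-02-15")],
)

rows = latest_per_customer(conn)
```

In a real test suite the fixture setup would live in a pytest fixture and the final line would be an assertion on `rows`.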
Benefits

This position offers an excellent opportunity for significant career development in a fast-growing and challenging entrepreneurial environment with a high degree of individual responsibility.

Top Skills

ADLS
Airflow
Azure Data Factory
Azure SQL Database
Azure Synapse Analytics
Cosmos DB
Databricks
Elasticsearch
Event Hub
Hadoop
Hive
Kafka
Neo4j
NiFi
Purview
PySpark
Spark
Streaming Analytics


