Senior Data Engineer, Data Streaming

Greater LA Area | Hybrid

About The Brand

Pluto TV, a ViacomCBS company, is the leading free streaming television service in America, delivering 250+ live and original channels and thousands of on-demand movies in partnership with major TV networks, movie studios, publishers, and digital media companies. Pluto TV is available on all mobile, web and connected TV streaming devices and millions of viewers tune in each month to watch premium news, TV shows, movies, sports, lifestyle, and trending digital series. Headquartered in West Hollywood, Pluto TV has offices in New York, Silicon Valley, Chicago and Berlin.

Overview and Responsibilities

The Sr. Data Engineer for streaming will be responsible for supporting and enhancing our Kafka- and Kinesis-based streaming data collection platforms. The ideal candidate will have strong experience with Kafka (open source or Confluent) and AWS Kinesis, including Kafka components (Topics, Producer/Consumer, KStream, KTable, and KSQL) and Kinesis components (streams, Firehose, and Data Analytics), and will have worked with streaming systems that handle very high-volume, high-velocity data. This requires a strong understanding of streaming architectures as well as experience with AWS, CI/CD tools, and infrastructure as code.
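
For context on the Kafka pieces named above (Topics, Producer/Consumer, KStream, KTable), here is a minimal Kafka Streams sketch in Java that counts client events per device and publishes the running totals to a downstream topic. The topic names, application ID, and broker address are hypothetical placeholders, not details of Pluto TV's actual pipeline.

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Produced;

public class PlaybackEventCounts {
    public static void main(String[] args) {
        // Basic Streams configuration; broker address and application ID are placeholders.
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "playback-event-counts");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();

        // KStream: the raw event stream, keyed by device ID (hypothetical topic name).
        KStream<String, String> events = builder.stream("client-playback-events");

        // KTable: a continuously updated count of events per device.
        KTable<String, Long> countsByDevice = events.groupByKey().count();

        // Publish the changelog of counts to a downstream topic.
        countsByDevice.toStream()
                .to("playback-counts-by-device", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}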

 

Working as part of the Data Engineering team, you will be responsible for managing data feeds and collecting analytical streaming data from our client software. In this fast-paced environment, you will deal with high-volume, high-velocity data and demonstrate strong analytical, problem-solving, organizational, and prioritization skills. This is a critical role with a wide range of responsibilities, including:

  • Demonstrated expertise with streaming data
  • Deep understanding of Kafka
  • Ability to assume ownership of and lead simultaneous projects
  • Strong time-management, prioritization, and organizational skills
  • Problem-solving and investigation skills
  • Strong interpersonal skills, with the ability to cultivate relationships and negotiate with internal clients
  • Ability to meet deadlines and partner timelines
  • Strong written and oral communication skills
  • Bachelor's degree or equivalent

Basic Qualifications

We believe the right individual will have the following skills and experience in order to be successful in this role:

  • Java, Python or Scala
  • Kafka (Topics, Producer/Consumer, KStream, KTable, KSQL)
  • AWS Kinesis (streams, Firehose, and Data Analytics); see the producer sketch after this list
  • SQL and familiarity with modern cloud databases like Snowflake
  • Frameworks: Spring Framework, Apache Flink, JUnit, Mockito
  • Cloud Technologies: AWS (VPC, Kubernetes, EC2, Lambda, CloudWatch, etc.)
  • Deployment Tools: Jenkins, Docker, Helm, Terraform
  • Monitoring tools: Grafana, Prometheus
  • CI/CD tools
  • Strong analytical and troubleshooting skills
  • At least 7 years of hands-on experience with data streaming systems and 10 or more years of data engineering experience
  • Effective and clear communication of open project risks and team dependencies
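
As a rough illustration of the AWS Kinesis item above, the sketch below writes one analytics record to a Kinesis data stream using the AWS SDK for Java v2 producer API. The stream name, region, partition key, and payload are hypothetical placeholders, not details of Pluto TV's actual pipeline.

import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.kinesis.KinesisClient;
import software.amazon.awssdk.services.kinesis.model.PutRecordRequest;
import software.amazon.awssdk.services.kinesis.model.PutRecordResponse;

public class KinesisEventProducer {
    public static void main(String[] args) {
        // The region and stream name below are placeholders for illustration.
        try (KinesisClient kinesis = KinesisClient.builder()
                .region(Region.US_WEST_2)
                .build()) {

            String payload = "{\"event\":\"playback_start\",\"deviceId\":\"abc-123\"}";

            PutRecordRequest request = PutRecordRequest.builder()
                    .streamName("client-analytics-events")
                    .partitionKey("abc-123") // determines which shard receives the record
                    .data(SdkBytes.fromUtf8String(payload))
                    .build();

            PutRecordResponse response = kinesis.putRecord(request);
            System.out.println("Wrote record to shard " + response.shardId()
                    + " with sequence number " + response.sequenceNumber());
        }
    }
}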

Location

750 N. San Vicente Blvd., Los Angeles, CA 90069
