Senior Software Engineer, Quantum

Comcast Denver, CO

About the Job

Business Unit:

As a Software Engineer for Quantum, a team within Comcast's Video Insights and Analysis (VIA) group, you will have the opportunity to collaborate closely with video engineers, business developers, and data scientists to design, develop, and evolve cutting-edge streaming data pipelines.

Leveraging both open-source and commercial frameworks, you will influence the design and development of innovative data products that process high-volume, high-velocity streams to enable real-time analytics and data-driven decisions that help us improve the quality of Comcast's IP video services.

Quantum designs data engineering solutions for the analytics pipelines behind Comcast's IP video on-demand and live-stream delivery systems, which include content acquisition, distribution, site reliability, and content playback.

What are some interesting problems you'll be working on?

Our systems currently process billions of events per day, and data volume is growing rapidly. You will play a crucial role in designing and implementing state-of-the-art technologies that are scalable and reliable. You will work with Apache Kafka, Apache Spark, and the AWS ecosystem, and continue to evolve data products and apply new industry standards to both the real-time and at-rest data pipelines in a fast-changing ecosystem of big data and open-source technologies.

Where can you make an impact?

The VIDEO organization is building the core components needed to drive the next generation of television. Running this infrastructure, identifying trouble spots, and optimizing the overall user experience is a challenge that can only be met with a robust big data architecture capable of providing insights that would otherwise be drowned in a sea of data.

Success in this role calls for a broad mix of skills and interests, ranging from distributed-systems software engineering prowess to the multidisciplinary field of data analysis and data engineering.

Core Responsibilities

  • Collaborate with VIDEO organization teams to develop a common data platform
  • Analyze massive amounts of data and drive prototype ideas for new tools and data products
  • Develop solutions to ingest high volumes of data from a variety of sources
  • Develop real-time analytical data pipelines based on the Kafka and Kinesis streaming platforms
  • Expand the feature set of a large-scale streaming data platform built on Spark, Kafka, and AWS
  • Apply software engineering best practices to deliver production-quality software components under an agile software development approach
  • Leverage technological advancements in the fast-growing big data ecosystem to develop insightful, cutting-edge data products

Basic Qualifications

  • 8+ years programming experience
  • Bachelor's or Master's degree in computer science, or equivalent work experience
  • Experience in software development of large-scale distributed systems, including a proven track record of delivering backend systems that participate in a complex ecosystem
  • Good current knowledge of Unix/Linux operating systems
  • Experience with test-driven development/test automation, continuous integration, and deployment automation
  • Enjoy working with data: data analysis, data quality, reporting, and visualization
  • Strong communicator, able to analyze complex issues and technologies and articulate them clearly and engagingly
  • Great design and problem-solving skills, with a strong bias for architecting at scale
  • Adaptable, proactive and willing to take ownership
  • Keen attention to detail and high level of commitment
  • Comfortable working in a fast-paced agile environment

Preferred Qualifications

  • Application development in AWS
  • 5+ years of JVM based programming (Java and Scala)
  • Data collection, transformation, and enrichment with computing frameworks such as Spark
  • Messaging middleware or distributed queuing technologies such as Kafka
  • Understanding and/or experience with serialization frameworks such as Avro, Thrift, Google Protocol Buffers, and Kryo
  • Understanding of container technologies (Docker/Kubernetes)
  • MapReduce experience in Hadoop using Pig, Hive, or other query/scripting technologies
  • Experience with distributed databases (e.g. HBase, Cassandra) or NoSQL databases (e.g. MongoDB)
  • Good understanding of any of: advanced mathematics, statistics, or probability

Comcast is an EOE/Veterans/Disabled/LGBT employer