Senior DevOps Engineer, Big Data

IT job: full-time position at Akamai / Kraków

Tags: devops bigdata kubernetes spark privatecloud

About the Job

Big Data: work at the heart of the Internet maintaining the next generation of Big Data applications. This position offers the opportunity to work on extremely large-scale, distributed Big Data applications. Be part of the engineering team that is defining the state of the art in Big Data.

About the Team

The Platform Data Services (PDS) team owns and operates the Big Data systems that collect, process, aggregate, and store Akamai network metrics for analytics, reporting, visualization, decision support, and provisioning. Together, PDS systems currently receive over 7 PB of data and process over 5 trillion records per day.

The data intelligence group (DDI) within PDS is responsible for the design, development, and maintenance of Big Data applications. We handle terabytes of data that need to be received, stored, and presented reliably.

Responsibilities:

- Work with cloud management tools, leveraging advanced skills in data analysis, network diagnostics, and debugging.
- Apply and deepen knowledge of related network protocols and distributed database implementations to measure, visualize, analyze, characterize, and improve the performance, robustness, availability, and scalability of the Akamai delivery platform.
- Enhance our mechanisms for incident prevention through proactive data analysis and monitoring.
- Work in all stages of the software release process, from development through testing and deployment to monitoring, across all test and production environments.
- Work on projects that make our network more stable, faster, and more secure.
- Work with our third-level engineering support to troubleshoot complex problems that cross team specializations.
- Develop skills to become an expert user of internal systems.
- Reproduce and troubleshoot issues to determine root cause.
- Provide support to other engineering groups that use our software.
- Support and maintain multiple separate staging infrastructures.

Basic Qualifications:

Education: Bachelor’s degree in Computer Science, a related field, or equivalent experience
2+ years of experience working in a Linux or Unix environment
2+ years of experience working with Internet protocols, including TCP/IP, HTTP, and SSL
Experience with Python and SQL
Highly responsible, self-disciplined, self-managed, and self-motivated

Desired Qualifications:

Master’s degree in Computer Science or a related field
Experience with virtualization and orchestration technologies
Experience with Linux kernel internals, Perl, make, PostgreSQL
Experience with NoSQL databases (Apache Cassandra in particular)
Experience with Go
Experience with Java debugging
Experience with exposing and consuming REST APIs



Published: 2019-04-25