Big Data DevOps Engineer - Java, Hadoop

  • Location:

    London, England

  • Sector:

    IT

  • Job type:

    Contract

  • Salary:

    Negotiable

  • Job ref:

    BBBH96374_1552391512

  • Published:

    9 days ago

  • Duration:

    5 months

  • Start date:

    ASAP

Big Data DevOps Engineer

Croydon

5 months initially

Pay - Competitive

This role requires candidates to be SC Cleared or eligible for SC Clearance

Client

My client, a global IT Services Consultancy, is looking for a Big Data DevOps Engineer to join them in Croydon, working onsite with a public sector client on a key project.

Role

  • New transformational program for a UK customer focusing on Big Data and advanced visualization capabilities
  • Agile mode of working with teams co-located in London
  • Sound DevOps expertise in CI/CD pipelines within the Big Data technology landscape, and the ability to articulate and influence leadership stakeholders on infrastructure automation, are prerequisites. In addition to core DevOps expertise, hands-on experience with AWS, Hortonworks, DynamoDB, S3, Java 8, HBase, Kafka, JanusGraph, HDF (NiFi) and related big data technologies is a must. A minimum of 4 years' demonstrable experience is required.

Skills

  • Experience with infrastructure automation tools like Ansible, Terraform, SaltStack, Puppet and Docker.
  • AWS Cloud Setup & Administration
  • Experience in DynamoDB
  • Strong hands-on skills in setting up Linux-based infrastructure.
  • Fluency in languages including Python & Java
  • Expertise in CI tools such as Jenkins/Nexus
  • Experience with Java Build Systems, primarily Maven, Gradle.
  • Support for code versioning and branching strategies using GitHub
  • Configuration management and backup/restore policies via specialised tools (Puppet) and custom Bash scripts.
  • System monitoring solutions such as Zabbix, Prometheus and Nagios
  • Log management via Elasticsearch and Logstash (ELK)
  • Hadoop Cluster administration, maintenance, capacity planning and optimization
  • Securing Hadoop (HDP essential) infrastructure using Knox and Kerberos

Suitable candidates should submit their CV in the first instance.
