Big Data DevOps Engineer - Java, Hadoop

  • Location

    London, England

  • Contact:

    James Mcgonnell

  • Published:

    2 months ago

  • Duration:

    5 months

Big Data DevOps Engineer


5 months initially

Pay - Competitive

This role requires candidates to be SC Cleared or eligible for SC Clearance


My client, a global IT Services Consultancy, is looking for a Big Data DevOps Engineer to join them in Croydon, working onsite with a public sector client on a key project.


  • New transformational program for a UK customer focusing on Big Data and advanced visualization capabilities
  • Agile mode of working with teams co-located in London
  • Sound DevOps expertise in CI/CD pipelines within the Big Data technology landscape, and the ability to articulate and influence leadership stakeholders on infrastructure automation, are prerequisites. In addition to core DevOps expertise, hands-on experience with AWS, Hortonworks, DynamoDB, S3, Java 8, HBase, Kafka, JanusGraph, HDF (NiFi) and related big data technologies is essential. A minimum of 4 years' demonstrable experience is required.


  • Experience with infrastructure automation tools like Ansible, Terraform, SaltStack, Puppet and Docker.
  • AWS Cloud Setup & Administration
  • Experience in DynamoDB
  • Strong hands on skills in setting up Linux-based infrastructure.
  • Fluency in languages including Python & Java
  • Expertise in CI tools such as Jenkins/Nexus
  • Experience with Java build systems, primarily Maven and Gradle.
  • Support for code versioning and branching strategies using GitHub
  • Configuration management and backup/restore policies via specialised tools (Puppet) and custom Bash scripts.
  • System monitoring solutions such as Zabbix, Prometheus and Nagios
  • Log management via Elasticsearch and Logstash (ELK)
  • Hadoop Cluster administration, maintenance, capacity planning and optimization
  • Securing Hadoop (HDP essential) infrastructure using Knox and Kerberos

Suitable candidates should submit their CV in the first instance