
Big Data Development Lead

Expired
  • Salary: Negotiable
  • Job type: Contract
  • Location: London
  • Sector: IT
  • Date posted: 17/07/2017
  • Job reference: J370461A

We're really sorry, but it looks like this job has already been filled.

Register your CV with us or see our latest jobs.

Big Data Development Lead

London (Croydon)

24 months

£450p/d

SC Cleared

Job Summary:

  • A new transformational programme for a UK customer focusing on Big Data and advanced visualisation capabilities.
  • We are seeking a talented and passionate Big Data Lead with a deep understanding of object-oriented design and extensive experience in Java, Big Data and Hadoop technologies.
  • We are looking for expertise in Java, Scala, Kafka, Samza, HBase, HDFS (preferably the Hortonworks distribution) and Python, with a focus on distributed, highly scalable and high-performance computing.
  • The role requires a Big Data Lead to work alongside Big Data Engineers on the continuing design, build, commissioning and testing of our client's Big Data platform.
  • SC clearance, eligibility for SC clearance, or CTC clearance.

Key Responsibilities:

  • Manage a team of 10-20 people
  • Develop data ingestion pipelines with Java/Scala, Kafka, Samza and HBase (see the sketch after this list).
  • Agile Development
  • 3rd Line Support
  • Continuous Integration
  • Unit Testing & Documentation

Job Requirements:

Essential Skills:

  • Past experience of managing a Big Data team
  • Hadoop/HDFS (Hortonworks Distribution)
  • HBase, Kafka, Kerberos, Drools
  • Graph DBs (ideally DataStax Enterprise, TinkerPop, Gremlin)
  • Big Data architecture, real-time data processing, Spark, Kafka, Samza
  • Solr
  • Java 8
  • Scala
  • ZooKeeper
  • Spring/Spring Boot and Spring Data REST
  • REST
  • TDD
  • BDD (specifically Cucumber)
  • Strong Communications Skills
  • Microservices architecture
  • Experience working within a Scrum team and familiarity with CI development techniques and environments
  • AWS Cloud

Nice to Have Skills:

  • Oracle, Postgres
  • Apache Camel, Knox, Ranger
  • Enterprise Integration Patterns
  • Kubernetes
  • Docker
  • Ansible, Terraform
  • JIRA
  • Confluence
  • Experience with DynamoDB; DataStax Graph and Gremlin also nice to have
  • Government experience
  • Knowledge of networking, DevOps, Enterprise Business Intelligence and Analytics
  • Demonstrated ability to develop, implement and maintain a set of Java-based operations and data services that provide a transactional and analytical platform.
  • This requires the use of
    • Open source frameworks and technologies such as Java 8, JBoss RESTEasy, Tomcat and Google Guice, and
    • CI and containerisation technologies such as Ambari, Puppet, Jenkins, Docker and Kubernetes