Data Engineer

Job Type:
Cloud & Infrastructure
Posted: 24 days ago

Data Engineer
6 Months
London (3 days a week)
£340 via umbrella

Would you like to join a global leader in consulting, technology services and digital transformation?

Our client is at the forefront of innovation to address the entire breadth of opportunities in the evolving world of cloud, digital and platforms.

What You Will Be Doing

  • Shape the portfolio of business problems to solve by building detailed knowledge of internal data sources
  • Model the data landscape, obtain data extracts and define secure data exchange approaches
  • Acquire, ingest, and process data from multiple sources and systems into the Cloud Data Lake
  • Operate in a fast-paced, iterative environment while remaining compliant with the bank's Information Security policies and standards
  • Collaborate with others to map data fields to hypotheses and to curate, wrangle, and prepare data for use in their advanced analytical models
  • Help architect the strategic advanced analytics technology landscape
  • Build reusable code and data assets
  • Codify best practices and methodology, and share knowledge with other engineers in UBS

What We Need

  • Experience in software development, including a clear understanding of data structures, algorithms, software design and core programming concepts
  • Comfortable multi-tasking, managing multiple stakeholders and working as part of an agile team
  • Excellent communication skills, including experience speaking to technical and business audiences and working globally
  • Expertise in Spark & Distributed Datasets design patterns
  • Strong problem solving and analytical skills
  • Keen to learn and share new concepts


  • Meaningful experience in following technologies:

Scala, SQL

  • Experience and interest in Cloud platforms such as Azure (preferred) or AWS
  • Experience in distributed processing using Apache Spark
  • Ability to debug using tools like the Ganglia UI; expertise in optimizing Spark jobs
  • Ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets
  • Expert in creating data structures optimized for storage and for various query patterns, e.g. Parquet and Delta Lake
  • Meaningful experience in at least one database technology such as:
    • Traditional RDBMS (MS SQL Server, Oracle)
    • NoSQL (MongoDB, Cassandra, Neo4j, CosmosDB, Gremlin)
  • Understanding of Information Security principles to ensure compliant handling and management of data
  • Experience in traditional data warehousing / ETL tools (Informatica, Azure Data Factory)
  • Ability to clearly communicate complex solutions
  • Proficient at working with large and complex code bases (GitHub, Gitflow, Fork/Pull model)
  • Working experience in Agile methodologies (Scrum, XP, Kanban)

All profiles will be reviewed against the required skills and experience. Due to the high number of applications we will only be able to respond to successful applicants in the first instance. We thank you for your interest and the time taken to apply!

As a Data Engineer within the exciting new Finance Risk and Data Analytics capability, you will be building big data solutions to solve some of the organisation's toughest problems and delivering significant business value.
This is an exciting time to join, as you will be helping to shape the Reference Data Mastering and Distribution architecture and technology stack within our new cloud-based data lakehouse.
