Data Ops Engineer

Location: Birmingham
Job Type: Permanent
Industry: Enterprise Applications
Job reference: BBBH230080_1718370092
Posted: 4 months ago

Data Ops Engineer

Role purpose:


The DataOps Engineer will work closely alongside Data Engineering, Data Science, Data Analytics and DevOps to ensure the availability of our Azure data platform. The role will ensure best practices are followed, with a focus on automation, scalability and security. The main responsibilities include monitoring infrastructure and data pipelines, implementing and monitoring CI/CD pipelines, administering our Databricks Lakehouse platform (including configuring and optimising clusters), and managing role-based access control.

Key Responsibilities:


Monitoring existing data pipelines, resolving issues and recommending areas for improvement
Monitoring our existing infrastructure, ensuring it is available and scalable to meet the needs of the business
Creating and maintaining CI/CD pipelines in Azure DevOps
Administering Databricks/Unity Catalog - creating and managing clusters and setting up RBAC (see the sketch after this list)
Identifying infrastructure cost optimisations
Setting up linked services in Azure Data Factory
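
As an illustration of the Databricks administration and monitoring work above, here is a minimal sketch that lists workspace clusters over the Databricks REST API and flags any in a transitional or error state. The environment variable names (DATABRICKS_HOST, DATABRICKS_TOKEN) are illustrative assumptions about how the workspace URL and a personal access token would be supplied, not part of this role description.

    import os
    import requests

    # Illustrative assumption: workspace URL and personal access token come
    # from environment variables.
    host = os.environ["DATABRICKS_HOST"]    # e.g. https://adb-<workspace-id>.azuredatabricks.net
    token = os.environ["DATABRICKS_TOKEN"]

    # List all clusters in the workspace via the Databricks REST API.
    resp = requests.get(
        f"{host}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()

    # Report each cluster's name and state; flag states other than the
    # normal RUNNING/TERMINATED ones so they can be investigated.
    for cluster in resp.json().get("clusters", []):
        name = cluster.get("cluster_name", "<unnamed>")
        state = cluster.get("state", "UNKNOWN")
        marker = "" if state in ("RUNNING", "TERMINATED") else "  <-- check"
        print(f"{name}: {state}{marker}")

A check like this would typically run on a schedule (for example from an Azure DevOps pipeline) and feed an alerting channel rather than printing to stdout.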

Technical / Professional Qualifications / Requirements:


Experience monitoring Azure Data Factory pipelines (see the sketch after this list)
Experience creating linked services in Azure Data Factory utilising Azure Key Vault
Good knowledge of Azure data services
Understanding of Data Engineering/Software Engineering best practices
Experience creating CI/CD pipelines in Azure DevOps
Databricks Workspace Administration
Understanding of implementing identity and access management
Setting up Unity Catalog (desirable)
Infrastructure as Code - Terraform (desirable)
Knowledge of cloud networking (desirable)
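
As a concrete illustration of the Azure Data Factory monitoring experience listed above, below is a minimal sketch that queries the last 24 hours of pipeline runs through the Azure management REST API (queryPipelineRuns) and prints any failures. The subscription, resource group and factory names are placeholders, and the use of DefaultAzureCredential is an assumption about how authentication would be configured in this environment.

    import datetime
    import requests
    from azure.identity import DefaultAzureCredential

    # Placeholder identifiers - substitute real values for your environment.
    SUBSCRIPTION_ID = "<subscription-id>"
    RESOURCE_GROUP = "<resource-group>"
    FACTORY_NAME = "<data-factory-name>"

    # Acquire a token for the Azure Resource Manager endpoint.
    credential = DefaultAzureCredential()
    token = credential.get_token("https://management.azure.com/.default").token

    # Query pipeline runs updated in the last 24 hours.
    now = datetime.datetime.now(datetime.timezone.utc)
    url = (
        f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
        f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.DataFactory"
        f"/factories/{FACTORY_NAME}/queryPipelineRuns?api-version=2018-06-01"
    )
    body = {
        "lastUpdatedAfter": (now - datetime.timedelta(hours=24)).isoformat(),
        "lastUpdatedBefore": now.isoformat(),
    }
    resp = requests.post(
        url, json=body, headers={"Authorization": f"Bearer {token}"}, timeout=30
    )
    resp.raise_for_status()

    # Print failed runs so they can be investigated or alerted on.
    for run in resp.json().get("value", []):
        if run.get("status") == "Failed":
            print(f"{run.get('pipelineName')} ({run.get('runId')}): {run.get('message')}")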
