Role Title: Spark/Scala Developer
Location: London - Days on site: 2-3
Duration: 2 months
Rate: £308
- Expertise in Spark and Scala
- Experience developing complex data transformation (ETL) workflows using Big Data technologies
- Strong expertise in Hive, Impala, and HBase
- Hands-on experience fine-tuning Spark jobs
- Experience with Java and distributed computing
- In-depth understanding of Big Data/Hadoop technologies, distributed computing, and data processing frameworks
- Exceptional analytical and problem-solving skills, with a focus on innovative and efficient solutions
- Demonstrable experience optimizing and fine-tuning big data applications for improved performance and efficiency
- Hands-on experience with relevant tools such as Apache Hadoop, Spark, Kafka, and other industry-standard platforms
- Good to have: external technology contributions (noteworthy open-source contributions)
