Hadoop/Spark/Scala/Java - Big Data Engineer - Copenhagen
Harvey Nash Consulting (Scotland) Limited
Copenhagen Municipality, Copenhagen
Fantastic opportunity for a Big Data Engineer (Hadoop/Spark) to relocate to Copenhagen - fantastic salary - sponsorship and relocation assistance available!
Harvey Nash are recruiting a number of Data Engineers with experience of Hadoop/Spark/Java/Scala/Python, etc. on behalf of our highly successful client based in Copenhagen, who are at an exciting new phase of developing their Big Data capability to power defences in Risk, Fraud and Financial Crime. This is a truly outstanding opportunity for a degree-educated, English-speaking Data Engineer to live in the beautiful city of Copenhagen and work on our client's next-generation Big Data Platform. You will have extensive experience in software development with Agile methodology, continuous integration and automated releases. Experience in engineering (commercial or open source) software platforms and large-scale data infrastructures is a must.
Responsibilities of the role:
Building distributed, highly parallelised Big Data processing pipelines which process massive amounts of data (both structured and unstructured) in near real time
Leverage Spark to enrich and transform corporate data to enable searching, data visualization, and advanced analytics
Working closely with analysts and business stakeholders to develop analytics models
Continuous delivery on Hadoop and other Big Data Platforms
Automate processes wherever possible so that they are repeatable and reliable
Work closely with QA team, release engineers and Product Management to deliver software in a continuous delivery environment
Skills and experience required:
Extensive experience in Hadoop/Spark development
A complex problem solver
Data driven thinking
Excellent communication skills and experience of distributed global teams
Solid grounding in Financial Services
Experience with Spark Streaming and Kafka in a large-scale enterprise environment is highly desired
Experience of implementing data security capabilities such as encryption, anonymisation etc.
Technical skillset should include a mix of the following:
Atlassian Suite: Bamboo, JIRA, Fisheye, Sonar, Git/Stash
Test frameworks: Robot, JMeter, JUnit, DbUnit
Configuration management: Puppet / Chef
Hadoop: YARN, Ambari, MR, Sqoop, Flume
Streaming technologies: Spark, Storm, NiFi
Hadoop security: AD, Knox, Ranger, Kerberos, Sentry
Java, Scala, Python
This is a once-in-a-lifetime opportunity to embrace a new culture and be part of our client's ambitious technical journey. Our client are offering a fantastic package plus relocation assistance. Please forward your CV for immediate consideration.