Vacancy is archived. Applications are no longer accepted.

Data Engineer - Network Analytics Platform

MIDDLE
✅ Hybrid

🕙 36 hours per week

Used Tools & Technologies

Not specified

Required Skills & Competences

Security @ 3 Docker @ 3 Kubernetes @ 3 Python @ 3 ETL @ 3 GitHub @ 3 CI/CD @ 3 Azure @ 3 Communication @ 3 Data Engineering @ 3 API @ 3

Details

This is what we offer you

  • Gross monthly salary between EUR 4,331 and EUR 6,186 (scale 09) for a 36-hour week
  • Thirteenth month's salary and 8% holiday allowance
  • 10% Employee Benefit Budget
  • EUR 1,400 development budget per year
  • Hybrid working: balance between home and office work (possible for most roles)
  • A pension, for which you can set the maximum amount of your personal contribution

View all our benefits.

As a Data Engineer in our Graph Analytics Team, you’ll play a critical role in building an Enterprise Knowledge Graph to be used by teams across the bank. Whether it’s designing an ontology that best fits the bank’s use cases or writing queries to extract valuable insights from a graph database, your work will enable more accurate insights and better decision-making. You’ll leverage graph analytics and data engineering to design scalable pipelines that integrate datasets from many different sources into one graph, providing a holistic, unified overview of our customers and products.
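As a rough illustration of the query work described above, the sketch below runs a simple Cypher query against a Neo4j-style graph database from Python (Neo4j is named later as a nice-to-have). The connection details, node labels (Customer, Product), and relationship type (HOLDS) are assumptions for illustration only, not the team's actual data model.

```python
# Hypothetical sketch: querying a Neo4j-style graph database from Python.
# URI, credentials, labels, and relationship types are illustrative only.
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"   # assumed local instance
AUTH = ("neo4j", "password")    # placeholder credentials

def customers_sharing_products(tx, min_shared: int = 2):
    """Find pairs of customers connected through the same products."""
    query = """
    MATCH (a:Customer)-[:HOLDS]->(p:Product)<-[:HOLDS]-(b:Customer)
    WHERE a.id < b.id
    WITH a, b, count(p) AS shared
    WHERE shared >= $min_shared
    RETURN a.id AS customer_a, b.id AS customer_b, shared
    ORDER BY shared DESC
    """
    return [record.data() for record in tx.run(query, min_shared=min_shared)]

with GraphDatabase.driver(URI, auth=AUTH) as driver:
    with driver.session() as session:
        for row in session.execute_read(customers_sharing_products):
            print(row)
```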

Collaborating with a multidisciplinary team, you’ll have the opportunity to develop a data product that emphasises the connections between entities as well as the entities themselves. You play a key role in delivering the right customer data, at the right time, in the right format, and with the right quality to our range of internal users. Creating a scalable knowledge graph that can be accessed via network features, APIs, or GraphRAG is a challenge you are eager to accept!

Practical examples

  • Build pipelines to construct graphs, then extract insights from them (see the sketch after this list)
  • Develop streaming capabilities for building a 'live' graph
  • Challenge the team to make improvements to our codebase
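
To make the first practical example more concrete, here is a minimal, hypothetical sketch of an "extract, build a graph, query it" step using pandas and networkx. The source datasets, column names, and the centrality metric are invented for illustration and do not describe the team's actual pipelines.

```python
# Illustrative sketch of "build a graph, then extract insights".
# Data, columns, and the chosen metric are assumptions, not the team's design.
import networkx as nx
import pandas as pd

# Extract: two hypothetical source datasets.
customers = pd.DataFrame({"customer_id": ["c1", "c2", "c3"]})
holdings = pd.DataFrame({
    "customer_id": ["c1", "c1", "c2", "c3"],
    "product_id":  ["p1", "p2", "p1", "p2"],
})

# Transform + load: integrate both sources into one graph.
graph = nx.Graph()
graph.add_nodes_from(customers["customer_id"], kind="customer")
graph.add_nodes_from(holdings["product_id"].unique(), kind="product")
graph.add_edges_from(holdings[["customer_id", "product_id"]].itertuples(index=False))

# Insight: which products connect the most customers?
centrality = nx.degree_centrality(graph)
products = {n for n, d in graph.nodes(data=True) if d["kind"] == "product"}
print(sorted(((centrality[p], p) for p in products), reverse=True))
```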

Facts & figures

  • 36 or 40 hours a week.
  • Mostly greenfield, limited legacy work.
  • 43,822 Rabobank colleagues around the world.

Top responsibilities

  • Collaborate with data scientists & software engineers to ensure seamless ETL pipelines and scalable processing of graph data.
  • Use your knowledge of CI/CD & testing to improve the quality of our product.
  • Take requirements from stakeholders and identify what data needs to be added to the graph, as well as how to model it effectively in a scalable, structured way.

Together we achieve more than alone

We believe that bringing together people's differences is what makes us an even better bank. Speaking of Rabobank: we are a Dutch bank that operates in 38 countries for over 9,500,000 customers. Together with these customers, our members, and partners, we stand side by side to create a world in which everyone has access to enough healthy food. In the Netherlands we work to create a country in which people are happy with how they live, work, and do business.

Within Rabobank, Enterprise Analytic Products is an extremely ambitious department that combines experience, innovation, and creativity to develop new products and services. We centrally create re-usable analytical building blocks to make an impact across the bank. We focus on learning fast, delivering value quickly, and being more effective. As a team, you work towards clear goals and continuously see the results of your (team) efforts.

You and your talent

  • At least 2 years of experience building ETL pipelines in production.
  • You have some demonstrable experience with Python, GitHub, and Azure Cloud implementations.
  • Experience with cloud platforms (preferably Azure) and containerization tools (Docker, Kubernetes).
  • You have experience in unit testing, integration testing, and end-to-end testing, with a focus on writing clean, maintainable, and testable code that adheres to industry best practices.
  • Competence in building CI/CD pipelines for automated model deployment and testing.
  • You are curious at heart and have experience collaborating with customers.
  • You have particularly effective communication skills in the English language (both written and spoken).
  • A passion for learning and applying new technologies.
  • A big plus is hands-on experience with graph analytics (e.g. Neo4j) and some experience building Gen AI applications (GraphRAG, prompting, OpenAI API, agents, etc.).
  • Another big plus is having experience with utilizing ontologies in structuring knowledge graphs.

You and the application process

  • Any questions about working at Rabobank and the process? Contact Raphaël Drenthel, IT Recruiter, via [email protected]
  • We will hold the interviews through a video call.
  • You can find answers to the most frequently asked questions on rabobank.jobs/en/faq
  • A security check is part of the process.
  • We respect your privacy.

#LI-RD2