Data Engineer

at ING
πŸ“ Amsterdam, Netherlands
πŸ“ Manila, Philippines
πŸ“ Bratislava, Slovakia
EUR 38,400 per year
Seniority: Middle
βœ… Hybrid
βœ… Visa Sponsorship

πŸ•™ 39 hours per week

Used Tools & Technologies

Not specified

Required Skills & Competences

Docker, Kubernetes, Python, SQL, Spark, GCP, Airflow, NoSQL, CI/CD, Distributed Systems, AWS, Azure, Communication, Data Engineering, Mentoring, Hive (all rated level 3)

Details

Financial Markets Client Control (FMCC) is a department within Financial Markets (FM) Business Services (part of ING Wholesale Banking). The department consists of a Centre of Excellence (COE) in Amsterdam and two execution locations in ING Hubs Bratislava and Manila that provide services to 30 countries worldwide. You will be joining a team of senior data analysts, collaborating on a cutting-edge Data Warehouse solution that handles a diverse spectrum of trading and operational datasets critical to business intelligence.

Role overview

We are looking for a Data Engineer to join us in Bratislava. In this role you will design, build, and optimize complex data pipelines to process transactional and other data. You will play a key role in evolving the data platform and mentoring a highly analytical team. You will collaborate closely with stakeholders in Amsterdam and BAU teams in Bratislava and Manila to develop and maintain data solutions and data architecture.

Responsibilities

  • Design, build and optimize complex data pipelines for transactional and other datasets.
  • Contribute to and evolve the Data Warehouse and broader data platform.
  • Mentor and support a highly analytical team of data practitioners.
  • Collaborate with stakeholders in Amsterdam and BAU teams in Bratislava and Manila to deliver and maintain data solutions and architecture.
  • Work on provisioning/deployment automation and maintain reliability of data processing systems.

Requirements

  • 3+ years of experience in data engineering.
  • Hands-on experience building complex data pipelines.
  • Mandatory experience with Apache Airflow, Spark, and Python.
  • Experience in setting up and optimizing both SQL and NoSQL data stores (MS-SQL, Hive).
  • Familiarity with object storage services (e.g., S3).
  • Experience with deployment and provisioning automation tools (e.g., Docker, Kubernetes, CI/CD).
  • Collaborative mindset, curious and empathetic, committed to learning and development.

Nice to have

  • Experience with one or more major cloud providers (Azure, GCP, AWS).
  • Experience managing and developing distributed systems and clusters for batch processing.
  • Excellent communication skills and ability to explain technical topics to non-technical audiences.

Salary

  • Wage (gross): 3,200 EUR per month. This is the minimum base wage component; final conditions are discussed and negotiated at the interview, but will not be lower than the stated minimum.

Working time & office policy

  • Full time. Daily working time: 7 hrs 45 min (flexible working hours and option to work from home).

Benefits

  • Flexible working hours and work from home.
  • Daily refreshments (coffee, fruits, drinks).
  • 24/7 access to a fully equipped gym, game room, modern coffee corner, and terrace with grill.
  • Sick leave compensation up to 80% of monthly salary.
  • Life insurance contribution, 3rd pillar contribution, 24/7 external mental health support.
  • Learning opportunities (internal and external programs), regular teambuildings and social events.
  • Extra 3 personal days per year; up to 10 extra vacation days/year; referral bonuses; childbirth/adoption and other one-off bonuses.
  • Yearly performance bonus, cafeteria points (50 EUR/month), green benefit reimbursement, home office contribution, discounts, welcome package, pet-friendly office.