Used Tools & Technologies
Not specified
Required Skills & Competences
Grafana @ 4, Kafka @ 4, Kubernetes @ 4, DevOps @ 3, Python @ 7, SQL @ 7, Spark @ 4, dbt @ 4, ETL @ 6, Airflow @ 4, NoSQL @ 7, CI/CD @ 3, Flink @ 4, Machine Learning @ 4, Azure @ 4, Data Engineering @ 4, Mentoring @ 4, Dashboarding @ 4, ELT @ 6, BI @ 4, Databricks @ 4, Power BI @ 4, Snowflake @ 4
Details
Join a purpose-driven team where your data engineering expertise directly accelerates Eneco’s mission to become the first CO₂-neutral energy company by 2035. Work with cutting-edge cloud technologies and diverse data sources—from smart meters to wind turbines—to build impactful, scalable data solutions from the ground up. Grow in a collaborative environment that values continuous learning, shared ownership, and technical excellence while shaping the future of data-driven innovation.
Responsibilities
- Architect, develop, and optimize cloud-based data platforms and pipelines.
- Support Machine Learning Engineers, Data Scientists, and Data Analysts by preparing and making high-quality data easily accessible.
- Collaborate with stakeholders to gather and discuss project requirements.
- Design and implement data solutions to handle, store, process, and utilize data from various sources (customer data, smart meters, wind turbines, weather, markets, internal processes).
- Deploy, monitor, and manage pipelines and applications in production (a minimal illustrative sketch of such a pipeline follows this list).
- Write well-documented code, conduct and participate in code reviews, and follow software engineering best practices.
- Provide technical guidance and mentoring to other engineers; lead new projects from inception.
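The responsibilities above centre on building and operating orchestrated data pipelines. As a purely illustrative sketch (not Eneco's actual stack or schema), the hypothetical Airflow DAG below ingests smart-meter readings and loads them into a warehouse table; every name, schedule, and callable is an assumption made for the example.

```python
# Hypothetical sketch only: a daily smart-meter pipeline. All IDs, paths, and
# names are invented for illustration, not taken from the posting.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_readings(**context):
    """Placeholder: pull raw smart-meter readings from a source system."""


def load_to_warehouse(**context):
    """Placeholder: validate the readings and write them to a warehouse table."""


with DAG(
    dag_id="smart_meter_readings_daily",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",            # run once per day
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_readings", python_callable=ingest_readings)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    ingest >> load                         # load only runs after a successful ingest
```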
Requirements
- Senior-level role: the posting is for a Senior Data Engineer and expects substantial experience.
- Experience: 5+ years as a Data or Software Engineer (note: the posting header mentions 3-5 years, but the requirements specify 5+ years).
- Proficiency in ETL/ELT processes, orchestration, data warehousing, monitoring and alerting, system design and architecture.
- Strong knowledge of data modeling, data governance, data quality, access control, and documentation.
- Familiarity with software engineering best practices: code reviews, version control, CI/CD, incident management, and DevOps principles.
- Experience with distributed data processing frameworks (examples given: Spark, Beam, Flink, Kafka); a brief sketch follows this list.
- Advanced proficiency in Python and SQL; experience with SQL and NoSQL databases.
- Hands-on experience with Databricks, Snowflake, dbt, Airflow, and Kubernetes; cloud deployments preferably in Azure.
- Experience with dashboarding tools (e.g., Power BI, Grafana).
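To illustrate the kind of distributed processing referenced above, here is a minimal PySpark sketch that aggregates raw meter readings into daily consumption per meter. The paths, column names, and schema are assumptions made for the example, not details from the posting.

```python
# Hypothetical sketch: aggregate raw smart-meter readings to daily totals with
# PySpark. Paths and column names are invented for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_meter_consumption").getOrCreate()

# Assumed raw layout: one row per reading with meter_id, reading_ts, kwh.
readings = spark.read.parquet("s3a://example-bucket/raw/meter_readings/")

daily = (
    readings
    .withColumn("reading_date", F.to_date("reading_ts"))  # truncate timestamp to date
    .groupBy("meter_id", "reading_date")
    .agg(F.sum("kwh").alias("kwh_total"))                  # total consumption per meter per day
)

# Partitioning by date keeps downstream consumers (e.g. dbt models, dashboards) cheap to query.
daily.write.mode("overwrite").partitionBy("reading_date").parquet(
    "s3a://example-bucket/curated/daily_meter_consumption/"
)
```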
What we offer / Benefits
- Gross annual salary between €82,000 and €115,000 (including FlexBudget and 8% holiday allowance; depending on role, a bonus or collective profit sharing may apply).
- FlexBudget options: payout, extra holiday days, or save for later.
- Personal and professional growth opportunities; support for development.
- Hybrid working model: 40% at the office, 40% from home, and 20% flexibly. Manager-approved work abroad is allowed for up to 3 weeks per year (at most 2 consecutive weeks).
- Work in a collaborative, purpose-driven environment contributing to the energy transition and Eneco’s CO₂-neutral goal by 2035.
Location & Process
- Office location: Eneco — Rotterdam, Netherlands. Hybrid working as described above.
- Contact information for recruitment and questions provided in the original posting.