Used Tools & Technologies
Not specified
Required Skills & Competences
Kubernetes @ 4, DevOps @ 4, Python @ 4, SQL @ 4, CI/CD @ 4, Azure @ 4, Data Engineering @ 6, Git @ 7, Mentoring @ 4, PostgreSQL @ 4, Performance Optimization @ 4, Azure DevOps @ 4, Snowflake @ 4
Play a pivotal role in shaping the data backbone of Eneco’s CO₂-neutral future, directly enabling smarter, more efficient heat asset performance at scale.
Lead the transition from experimental data solutions to production-grade platforms, influencing architecture, ownership, and long-term technical direction within APO. Work at the intersection of deep engineering and real-world energy impact, with autonomy to define scalable data systems in an evolving, high-trust environment.
Responsibilities
- Lead improvements to APO data ingestion and platform architecture.
- Drive the transition of POCs and project-based solutions into structured run & maintain environments.
- Support strategic discussions on platform ownership, technical direction, and scalability.
- Ensure APO data products are robust, observable, and production-ready.
- Contribute to technical decision-making around Snowflake architectures, Kubernetes-based services, Python backend applications, and database design.
- Mentor engineers and promote best practices across the team.
- Drive technical governance, documentation, and quality standards.
Requirements
- 5+ years of experience as a Data Engineer or Backend Engineer in operational or data-intensive environments.
- Advanced Python expertise with strong application of software engineering best practices.
- Expert SQL knowledge with Snowflake and PostgreSQL.
- Strong Git expertise including advanced workflows and branching strategies.
- Experience designing CI/CD pipelines and DevOps practices (preferably Azure DevOps).
- Solid Kubernetes experience and understanding of container orchestration concepts.
- Strong architectural mindset with experience in evolving system landscapes and product transitions.
- Comfortable taking initiative in environments where structure and ownership are still being defined.
You’ll be responsible for
- Leading the technical evolution of APO data platforms and backend services.
- Designing scalable and maintainable data architectures for operational heat asset analytics.
- Maintaining and optimizing Snowflake data models and PostgreSQL databases.
- Ensuring reliability and observability of Kubernetes-based services and pipelines.
- Translating complex stakeholder needs into robust and future-proof solutions.
- Guiding and mentoring engineers and contributing to knowledge sharing.
This is where you’ll work
You will join Eneco’s Asset Performance Optimization (APO) team, where data and engineering directly support the transition to a CO₂-neutral energy system. You will collaborate closely with business and technical stakeholders to transform complex operational challenges into scalable, production-ready data platforms that monitor and optimize heat assets in real time.
Benefits
- Gross annual salary between €83.000 and €117.000 (including FlexBudget, 8% holiday allowance, and, depending on your role, a bonus or collective profit sharing).
- FlexBudget: can be paid out, used to buy extra holiday days, or saved.
- Personal and professional development support.
- Hybrid working: 40% at the office, 40% from home, and 20% flexibly. With manager approval, you may work abroad (within approved countries) up to 3 weeks per year, with a maximum of 2 consecutive weeks.
- Flexible schedule, a healthy work-life balance, and the opportunity to contribute to Eneco’s mission of climate neutrality by 2035.
Application process & contacts
Contact the recruiter for more information (contact details provided in the original posting).