Data Engineer (Energy Management)
🕙 32-40 hours per week
Used Tools & Technologies
Not specified
Required Skills & Competences
Grafana @ 3 Kubernetes @ 3 DevOps @ 3 IaC @ 3 Python @ 5 Scala @ 3 SQL @ 2 Spark @ 3 dbt @ 2 Java @ 5 NoSQL @ 3 RDBMS @ 3 CI/CD @ 3 Machine Learning @ 3 MLOps @ 2 Data Science @ 2 AWS @ 3 Data Engineering @ 3 FastAPI @ 3 SRE @ 3 Debugging @ 3 API @ 3 Databricks @ 3 Snowflake @ 2
Details
As a Data Engineer at Eneco you will be instrumental in setting up and leading technical decisions for our cloud-based data platform that supports consumer IoT products (smart thermostats, energy insight, smart charging). You will ensure real-time time-series data processing in Databricks (Scala) environments is robust and scalable, make data accessible and usable for other departments, and advise product teams on technical choices.
Responsibilities
- Ensure real-time time-series data processing in Databricks (Scala) environments is robust, scalable, and production-ready; a minimal illustrative sketch of such a pipeline follows this list.
- Set up projects and lead technical decisions involving cloud-based data solutions.
- Combine cloud infrastructure setup and API server maintenance with the development of streaming and batch data processing pipelines.
- Empower other departments by making data accessible and usable to drive Eneco's digital innovations.
- Provide technical advice and shape product direction together with the product manager and other stakeholders.
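To give a concrete flavour of the streaming work described above, the sketch below shows a minimal Spark Structured Streaming job in Scala of the kind that could run on Databricks. The Kafka broker, topic name, schema fields, and Delta paths are illustrative assumptions, not details taken from this vacancy.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object ThermostatStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("thermostat-stream").getOrCreate()
    import spark.implicits._

    // Hypothetical schema for smart-thermostat readings
    val schema = new StructType()
      .add("deviceId", StringType)
      .add("eventTime", TimestampType)
      .add("temperature", DoubleType)

    // Read raw JSON events from a placeholder Kafka topic
    val readings = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092") // placeholder broker
      .option("subscribe", "thermostat-readings")       // placeholder topic
      .load()
      .select(from_json($"value".cast("string"), schema).as("r"))
      .select("r.*")

    // Windowed average temperature per device, tolerating 10 minutes of late data
    val perDevice = readings
      .withWatermark("eventTime", "10 minutes")
      .groupBy($"deviceId", window($"eventTime", "5 minutes"))
      .agg(avg($"temperature").as("avgTemp"))

    // Append results to a Delta table; the checkpoint makes the query restartable
    perDevice.writeStream
      .format("delta")
      .option("checkpointLocation", "/tmp/checkpoints/thermostat") // placeholder path
      .outputMode("append")
      .start("/tmp/delta/thermostat_agg")                          // placeholder path
      .awaitTermination()
  }
}
```

In practice, the watermark, window size, and output mode would depend on how downstream consumers read the aggregated time-series data.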
Requirements
Must have
- Previous experience with REST API development (for example, Spring or FastAPI).
- Understanding of streaming data ingestion and processing.
- Experience with MPP data platforms such as Spark; Databricks and Unity Catalog experience is a plus.
- Proficiency in programming languages: Java, Scala, and Python.
- Knowledge of software engineering best practices: code reviews, version control, testing, and CI/CD; a small testing sketch follows this list.
- Genuine interest in DevOps / SRE principles for production deployment.
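As an indication of the engineering practices listed above, the snippet below shows how a small Spark transformation can be kept free of I/O and covered by a ScalaTest unit test that can run in a CI pipeline. The object, class, and field names are hypothetical.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.avg
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical transformation, separated from I/O so it is easy to test
object Transformations {
  def averageTemperaturePerDevice(readings: DataFrame): DataFrame =
    readings.groupBy("deviceId").agg(avg("temperature").as("avgTemp"))
}

class TransformationsSpec extends AnyFunSuite {
  private val spark =
    SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()
  import spark.implicits._

  test("computes the average temperature per device") {
    val input = Seq(("d1", 20.0), ("d1", 22.0), ("d2", 18.0)).toDF("deviceId", "temperature")

    val result = Transformations.averageTemperaturePerDevice(input)
      .collect()
      .map(r => r.getString(0) -> r.getDouble(1))
      .toMap

    assert(result("d1") == 21.0)
    assert(result("d2") == 18.0)
  }
}
```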
Nice to have
- Experience with high-volume time-series data.
- Knowledge of data modelling and architecture patterns.
- Experience deploying applications to Kubernetes and skills in monitoring (Grafana) and debugging.
- Knowledge of cloud providers (e.g. AWS); Infrastructure as Code (IaC) is a plus.
- Experience with NoSQL databases (e.g. DynamoDB) and RDBMS (e.g. Postgres).
- Proficiency in SQL and dbt (Data Build Tool) and familiarity with Snowflake.
- Familiarity or interest in MLOps and data science techniques.
Team and workplace
You will work with other Data Engineers, Machine Learning Engineers, Data Scientists, and Data Analysts to shape IoT products that transform how consumers use their energy. The team encourages learning, collaboration, and continuous improvement.
Benefits
- Gross annual salary between €70.000 and €110.000 (including FlexBudget, 8% holiday allowance, and depending on role a bonus or collective profit sharing).
- FlexBudget: have it paid out, use it to buy extra holiday days, or save it.
- Personal and professional growth support from Eneco.
- Hybrid working: 40% at the office, 40% from home, and 20% flexible. With manager approval, working abroad is allowed up to 3 weeks/year (max 2 consecutive weeks) within approved countries.
About Eneco
Eneco pursues the One Planet strategy with the ambition to be climate neutral by 2035. The Digital Core team focuses on modernizing chat, app, and web environments to deliver superior digital customer experiences that help customers become greener every day.
Application process and contact
For questions about the application procedure, contact recruiter Venetia de Wit (+31615850813).