Data Engineer (Energy Management)
at Eneco
Rotterdam, Netherlands
EUR 70,000-110,000 per year
32-40 hours per week
Used Tools & Technologies
Not specified
Required Skills & Competences
Grafana @ 3, Kubernetes @ 3, DevOps @ 3, IaC @ 3, Python @ 5, Scala @ 3, SQL @ 5, Spark @ 3, dbt @ 5, Java @ 5, NoSQL @ 3, RDBMS @ 3, CI/CD @ 3, Machine Learning @ 3, MLOps @ 2, Data Science @ 2, AWS @ 3, Data Engineering @ 3, FastAPI @ 3, SRE @ 3, API @ 3, Databricks @ 3, Snowflake @ 5
Details
As a Data Engineer you will play a crucial role in setting up and leading technical decisions for Eneco's cloud-based data platform. Your work combines cloud infrastructure setup, API server maintenance, and the development of streaming and batch data processing pipelines for IoT products (smart thermostat, energy insight, smart charging). The role focuses on robust, scalable real-time time-series data processing in Databricks (Scala) environments and on enabling other departments by making data accessible and usable.
Responsibilities
- Ensure real-time time-series data processing in Databricks (Scala) environments is robust and scalable.
- Set up projects and lead technical decisions involving real-time time-series data.
- Empower other departments by making data accessible and usable to drive Eneco's digital innovations.
- Design and implement cloud solutions to meet product requirements.
- Maintain API servers and contribute to REST API development.
- Shape the product by providing technical advice to product managers and other team members.
- Ensure solutions are robust, scalable, and production-ready, applying DevOps/SRE principles.
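In production, the real-time time-series processing described above would run on Spark Structured Streaming in Databricks (Scala). Purely as an illustration of the core idea behind such pipelines, here is a toy pure-Python sketch of tumbling-window aggregation over IoT sensor readings; all device names, timestamps, and values are hypothetical and not taken from the posting:

```python
from collections import defaultdict

def tumbling_window_avg(readings, window_s=300):
    """Average sensor values per device per fixed (tumbling) time window.

    `readings` is an iterable of (device_id, unix_ts, value) tuples --
    a stand-in for one micro-batch of a streaming source.
    """
    sums = defaultdict(lambda: [0.0, 0])  # (device, window_start) -> [sum, count]
    for device, ts, value in readings:
        bucket = ts - (ts % window_s)  # align timestamp to its window start
        acc = sums[(device, bucket)]
        acc[0] += value
        acc[1] += 1
    return {key: s / n for key, (s, n) in sums.items()}

# Hypothetical smart-thermostat readings: two fall in the same
# 5-minute window, the third in the next one.
readings = [
    ("thermostat-1", 1_700_000_010, 20.0),
    ("thermostat-1", 1_700_000_020, 22.0),
    ("thermostat-1", 1_700_000_310, 18.0),
]
averages = tumbling_window_avg(readings)
```

The equivalent Spark construct would be `groupBy(window($"ts", "5 minutes"), $"device")` with an `avg` aggregation, which adds distribution and late-data handling that this sketch deliberately omits.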
Requirements
Must have:
- Previous experience with REST API development (e.g. Spring or FastAPI).
- Understanding of streaming data ingestion and processing.
- Previous experience with MPP data platforms such as Spark. Working experience with Databricks and Unity Catalog is a plus.
- Proficiency in programming languages: Java, Scala, and Python.
- Knowledge of software engineering best practices: code reviews, version control, testing, CI/CD.
- Genuine interest in DevOps/SRE principles for production deployment.
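The REST API experience asked for above refers to frameworks such as Spring or FastAPI; as a framework-free illustration of the same pattern, here is a minimal sketch using only the Python standard library, exposing a hypothetical `/readings` endpoint with made-up example data:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class ReadingsHandler(BaseHTTPRequestHandler):
    # Hypothetical in-memory data; a real service would query a store.
    READINGS = [{"device": "thermostat-1", "ts": 1700000000, "value": 20.5}]

    def do_GET(self):
        if self.path == "/readings":
            body = json.dumps(self.READINGS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging in this demo

# Bind to an ephemeral port, serve in the background, and call the endpoint.
server = HTTPServer(("127.0.0.1", 0), ReadingsHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()
with urllib.request.urlopen(f"http://127.0.0.1:{port}/readings") as resp:
    data = json.loads(resp.read())
server.shutdown()
```

FastAPI or Spring would replace the hand-written routing and serialization here with declarative route decorators/annotations, validation, and OpenAPI docs.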
Nice to have:
- Working experience with high-volume time series data.
- Knowledge of data modelling and architecture patterns.
- Experience deploying applications to Kubernetes and using monitoring tools (e.g. Grafana).
- Experience with cloud providers (e.g. AWS) and Infrastructure as Code (IaC).
- Experience with NoSQL databases (e.g. DynamoDB) and RDBMS (e.g. Postgres).
- Proficiency in SQL and dbt (Data Build Tool) with Snowflake.
- Familiarity or interest in MLOps and data science techniques.
Other details from the posting:
- Experience level indicated: ~3-5 years.
- Working hours: 32-40 hours per week.
- Team: you will work with Data Engineers, Machine Learning Engineers, Data Scientists and Data Analysts to shape IoT products.
Benefits
- Gross annual salary between €70,000 and €110,000 (including FlexBudget, 8% holiday allowance, and depending on the role a bonus or collective profit sharing).
- FlexBudget (can be paid out, used to buy extra holiday days, or saved).
- Personal and professional growth support.
- Hybrid working: typically 40% at the office, 40% at home, and one flexible day.