Data Engineer Energy Management
32-40 hours per week
Used Tools & Technologies
Not specified
Required Skills & Competences
Grafana @ 3, Kubernetes @ 3, DevOps @ 3, IaC @ 3, Python @ 5, Scala @ 3, SQL @ 5, Spark @ 3, dbt @ 5, Java @ 5, NoSQL @ 3, RDBMS @ 3, CI/CD @ 3, MLOps @ 2, Data Science @ 2, AWS @ 3, Data Engineering @ 3, FastAPI @ 3, SRE @ 3, Debugging @ 3, API @ 3, Databricks @ 3, Snowflake @ 5
Details
As a Data Engineer, you will play a crucial role in setting up and leading technical decisions for our cloud-based data platform. We are specifically looking for someone who will combine cloud infrastructure setup, API server maintenance, and the development of streaming/batch data processing pipelines. You will be working on exciting IoT products (smart thermostat, energy insight, smart charging) for our consumers.
Responsibilities
- Ensure real-time time-series data processing in Databricks (Scala) environments is robust and scalable.
- Empower other departments by making data accessible and usable, contributing to Eneco's digital innovations.
- Lead technical decisions on cloud-based data solutions and advise the product team.
- Design and implement cloud solutions to handle product requirements.
- Shape the product by providing technical advice to the product manager and other teams.
- Ensure our solutions are robust, scalable, and ready to meet future challenges.
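The first responsibility centres on real-time time-series processing. The posting names Databricks (Scala) for this; as a minimal, library-free illustration of the underlying idea, the sketch below aggregates sensor readings into fixed tumbling windows in plain Python. The helper name `tumbling_window_avg` and the sample data are hypothetical, not from the posting.

```python
from collections import defaultdict


def tumbling_window_avg(readings, window_seconds=300):
    """Average (timestamp, value) readings into fixed, non-overlapping windows.

    readings: iterable of (unix_timestamp, value) pairs.
    Returns {window_start_timestamp: mean value}, ordered by window start.
    """
    buckets = defaultdict(list)
    for ts, value in readings:
        # Align each reading to the start of its window.
        window_start = ts - (ts % window_seconds)
        buckets[window_start].append(value)
    return {start: sum(vals) / len(vals) for start, vals in sorted(buckets.items())}


# Hypothetical thermostat readings: (unix timestamp, temperature in °C).
readings = [(0, 20.0), (100, 22.0), (300, 24.0), (650, 30.0)]
averages = tumbling_window_avg(readings, window_seconds=300)
# → {0: 21.0, 300: 24.0, 600: 30.0}
```

In a production Databricks pipeline the same windowing would typically be expressed with Spark Structured Streaming's `window()` aggregation rather than in-memory dictionaries.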
Requirements
Must Have:
- Previous experience with REST API development (e.g. Spring or FastAPI).
- Understanding of streaming data ingestion and processing.
- Previous experience working with MPP data platforms such as Spark. Working experience with Databricks and Unity Catalog is a plus.
- Proficiency in programming languages (Java, Scala, and Python).
- Knowledge of software engineering best practices: code reviews, version control, testing, and CI/CD.
- Genuine interest in DevOps/SRE principles for production deployment.
Nice to Have:
- Working experience with high-volume time series data.
- Knowledge of data modelling and architecture patterns.
- Experience deploying applications to Kubernetes, with skills in monitoring (Grafana) and debugging.
- Knowledge of a cloud provider (e.g. AWS); infrastructure as code (IaC) experience is a plus.
- Experience with NoSQL databases (e.g. DynamoDB) and RDBMS (e.g. Postgres).
- Proficiency in SQL and DBT (Data Build Tool) with Snowflake.
- Familiarity with or interest in MLOps and data science techniques.
Benefits
Working at Eneco offers ambition, growth, and opportunities for personal development. You will have flexible working options and space to improve yourself while achieving a good work-life balance. Join us in our mission towards climate neutrality by 2035, helping our customers to become more sustainable.