Required Skills & Competences
- Python @ 5, SQL @ 5
- Grafana @ 3, Kubernetes @ 3, DevOps @ 3, Spark @ 3, dbt @ 3, ETL @ 3, Airflow @ 3, CI/CD @ 3, Machine Learning @ 3, Azure @ 3, Data Engineering @ 3, FastAPI @ 3, ELT @ 3, API @ 3, Databricks @ 3, Snowflake @ 3
Details
As a Full-Stack Data Engineer at Eneco you will build scalable cloud data platforms and pipelines that support the company's mission to become CO₂-neutral by 2035. You will work hands-on across the full data lifecycle (ingesting, processing, modeling, deploying and monitoring data solutions), collaborating closely with Data Scientists, Machine Learning Engineers and business stakeholders to turn complex data into actionable insights and products.
Responsibilities
- Design, build and maintain cloud-based data platforms and end-to-end data pipelines.
- Process and leverage diverse data sources (customer data, smart meters, wind turbines, weather, markets, internal processes) to generate business insights and new products.
- Support Machine Learning Engineers, Data Scientists and Data Analysts by preparing and making high-quality data easily accessible.
- Deploy, monitor and manage data pipelines and applications in production.
- Write well-documented code and perform code reviews.
- Collaborate with stakeholders to gather and discuss project requirements and design solutions.
Requirements
Must have:
- 3–5 years of experience as a Data Engineer.
- Experience in data engineering responsibilities: ETL/ELT, orchestration, warehousing, monitoring and alerting, and applying best practices.
- Experience with software engineering best practices: code reviews, version control, CI/CD, monitoring and DevOps principles.
- Experience with cloud deployments (preferably Azure) and deploying applications.
- Experience with data platforms and tools such as Databricks, Snowflake, dbt, Airflow, Spark or equivalent.
- Experience with monitoring tools (e.g., Grafana).
- Proficiency in SQL and Python.
Nice to have:
- Experience in data modeling, data governance, data quality, access control and documentation.
- Experience with Kubernetes.
- API development experience (e.g., FastAPI).
Where you’ll work / Team
- You will join a cross-functional team in Enterprise IT Solutions working with Data Engineers, Machine Learning Engineers, Data Scientists and Analysts on cloud-based platforms transforming diverse data streams into insights and solutions.
- Hybrid working model: 40% at the office, 40% from home and 20% flexible. With manager approval, temporary work abroad is possible within approved countries (up to 3 weeks/year, max 2 consecutively).
Benefits
- Gross annual salary between €61,000 and €86,000, including FlexBudget and 8% holiday allowance; depending on the role, a bonus or collective profit sharing applies.
- FlexBudget options: payout, buy extra holiday days, or save.
- Professional growth and development support.
- Hybrid working and flexible schedule supporting work-life balance.
Application process
- Apply via the careers website. Applications via email will not be considered.
- For questions contact the recruiter listed in the posting.