Data Engineer - Level Intermediate - Deadline 22/01/26
Freelance, experienced

Job Overview
Build and maintain scalable data pipelines and ETL processes in cloud environments (AWS, Azure).
Integrate data from multiple sources to support analytics, business intelligence, and ICS2/SSA project needs.
Ensure data quality, security, and compliance with project and EU requirements.
Work with data scientists and analysts to deliver clean, well‑structured datasets.
Optimize data storage and retrieval for performance and cost efficiency on cloud platforms.
Support the design, implementation, and daily operations of the data and metadata processing parts of the cloud solution.
Make sure all stakeholders can access data securely and efficiently according to their roles.
Provide support and training on data processing tools and workflows.
Develop and implement data cleansing, data preparation, and other data processing solutions using project tools.
Contribute to data modelling, interface definitions, and technical documentation.
Support the design and use of the SSA platform, focusing on data integration, storage, accessibility, processing, and security.
Participate in analytics use cases, testing scenarios, and platform tool deployment.
Document operational procedures and technical specifications (interfaces, data models, etc.).
Design or support the design of data processing algorithms for specific use cases.
KNOWLEDGE AND SKILLS:
Strong knowledge of: Kubernetes, Docker, Cloudera, Spark, Kafka, Microservices, Relational DBMS, REST APIs, AWS, Azure
Excellent understanding of ETL processes and data quality best practices.
Ability to quickly take responsibility even with limited documentation.
Strong communication skills for both technical and non‑technical audiences.
Ability to produce clear, structured technical documentation.
Strong analytical and problem‑solving skills.
Ability to work in fast‑changing big‑data environments.
SPECIFIC EXPERTISE:
Good knowledge of microservices and cloud architecture.
Good knowledge of application design.
Excellent knowledge of Relational DBMS.
Good understanding of interoperability technologies (web services, message‑oriented middleware, service bus, event architecture).
Knowledge of high‑availability systems, big data, analytics solutions, and ideally machine learning platforms.
Experience with tools such as Oracle, SAP, Dataiku, Denodo, Kafka, Superset, LDAP, Spark, Presto, etc.
Good understanding of security (system and data security).
Knowledge of programming languages and network protocols.
Excellent knowledge of XML, HTML, JSON.
Level: Intermediate
Delivery mode: Near Site (Brussels)
Deadline 22/01/26