Enterprise Data Architect – Brussels
Job Reference: CWS-736-0407
Location: Brussels
Type: Contract
Start date: ASAP
End date: 170 working days + possibility of extension
An international client based in Brussels is seeking an Enterprise Data Architect to join the team on a hybrid contract assignment. The initial contract runs for 170 working days, with a realistic possibility of renewal. The role requires 5 days per month on site, so candidates must live within a maximum of 2 hours' driving distance of Brussels. Candidates must be EU citizens for this role.
Key tasks and responsibilities:
- Creation and maintenance of Enterprise Data Warehouses (EDW) and complex Business Intelligence Solutions (Data Lakes / Data Lakehouses).
- Designing architectures in the cloud, on premises, or a combination of both (hybrid).
- Design a modern scalable data platform to replace a large legacy data system in a phased approach.
- Align architectural decisions with data governance policies and the department’s vision on cloudification.
- Design of data quality, audit and health monitoring processes.
- Define best practices, standards, rules, policies, review processes.
- Provide guidance and mentorship to data analysts and data engineers.
- Facilitate change management by guiding colleagues and users through the migration process.
- Document and maintain data architecture and data assets in detail.
Required skills and experience:
- Master's degree in IT or a relevant field and 17 years of professional experience in IT.
- Experience in migrating legacy data systems to a modern data platform.
- Excellent knowledge of designing scalable and flexible modern data architectures.
- Excellent knowledge of business intelligence reporting tools.
- Strong understanding of cloud data architecture principles.
- Knowledge of database systems, both relational (PostgreSQL, Oracle) and non-relational (Elasticsearch, MongoDB).
- Experience with ETL/ELT processes, API integration, data ingestion and transformation tools (dbt, Spark, Talend, Fabric, SAP DS).
- Proficiency in data pipeline orchestration tools (Airflow, Dagster).
- Knowledge of data governance frameworks and tools (DataHub, Open Metadata, Atlas, Collibra), data quality management, data security, access control and regulatory compliance.
- Experience with DevOps practices and tools related to data pipelines, including CI/CD for data infrastructure.
- Excellent communication skills to articulate data architecture concepts to technical and non-technical stakeholders.
- Good knowledge of data modelling tools.
- Fluent in English (minimum level B2).