as possible

ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line

SKILLS:
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- Powershell / Bash
- GROUP Cloud Data Hub (CDH)
- GROUP CDEC Blueprint
- Business Intelligence (BI) experience
- Technical data modelling and schema design ("not drag and drop")
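The ETL and Boto3 items above concern extracting, reshaping, and loading data. A minimal, stdlib-only sketch of the transform stage (the function name, CSV layout, and field names are illustrative assumptions; in a real CDH pipeline the extract and load steps would typically use boto3 S3 calls, as noted in the comments):

```python
import csv
import io
import json

def transform(raw_csv: str) -> list:
    """Transform stage of a toy ETL step: parse raw CSV rows,
    normalise field names, and cast numeric columns."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    out = []
    for row in rows:
        out.append({
            "customer_id": int(row["id"]),
            "revenue_eur": float(row["revenue"]),
            "region": row["region"].strip().upper(),
        })
    return out

# Extract: in practice this might come from e.g.
# s3.get_object(Bucket=..., Key=...)["Body"].read().decode()
raw = "id,revenue,region\n1,100.5, emea \n2,200.0,apac\n"

records = transform(raw)

# Load: in practice e.g. s3.put_object(...) with the serialised records
payload = json.dumps(records)
print(records[0]["region"])  # EMEA
```

Keeping the transform a pure function of its input text makes it easy to unit-test independently of any S3 or Docker runtime.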
business intelligence.

RESPONSIBILITIES:
- Design and map data models to turn raw data into meaningful insights
- Utilize Power BI with apt objectives
- Analyse past and present data to support better decision-making
- Transform business requirements into publications
- Build multi-dimensional data models
- Develop strong data documentation about algorithms and parameters
- Define and design new systems
- Take care of data warehouse development
- Provide essential technical input in terms of benefits and risks
- Prepare test data for testing of user stories
- Execute and/or support process owners
- Prepare cut-over strategy, e.g., data migration
- Go-Live preparation and post Go-Live support
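"Build multi-dimensional data models" in a duty list like this usually means dimensional (star-schema) design: a fact table of measures keyed to descriptive dimension tables. A minimal sketch using SQLite (all table and column names are illustrative assumptions, not GROUP's actual model):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes used for slicing
cur.execute("CREATE TABLE dim_region (region_id INTEGER PRIMARY KEY, name TEXT)")

# Fact table: numeric measures plus foreign keys into the dimensions
cur.execute("""CREATE TABLE fact_sales (
    sale_id   INTEGER PRIMARY KEY,
    region_id INTEGER REFERENCES dim_region(region_id),
    revenue   REAL)""")

cur.executemany("INSERT INTO dim_region VALUES (?, ?)",
                [(1, "EMEA"), (2, "APAC")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 100.0), (2, 1, 50.0), (3, 2, 75.0)])

# Typical BI query: aggregate measures grouped by a dimension attribute
cur.execute("""SELECT r.name, SUM(f.revenue)
               FROM fact_sales f
               JOIN dim_region r USING (region_id)
               GROUP BY r.name ORDER BY r.name""")
print(cur.fetchall())  # [('APAC', 75.0), ('EMEA', 150.0)]
```

The same shape is what a tool like Power BI consumes: measures aggregate from the fact table, while filters and drill-downs come from the dimensions.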
EXPERIENCE:
- Architecture: Cloud, on-prem, hybrid, data modelling, SW architecture