ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the applicable standards.

Required skills and experience:
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- GROUP Cloud Data Hub (CDH)
- GROUP CDEC Blueprint
- Business Intelligence (BI) experience
- Technical data modelling and schema design ("not drag and drop")
Data Engineers perform data quality checks in a methodical manner to understand how to accurately utilise client data. Expert understanding of the challenges of advanced data manipulation, complicated programming logic, and large data volumes is required.

Responsibilities:
- Provide solutions for data-driven applications involving large and complex data, including reconciliation from source systems to end reporting data marts.
- Design conceptual and physical data models for a global data warehouse in line with established practices/standards.
- Build monitoring and testing mechanisms on data transformations.
- Drive continuous improvements on AWS.
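As an illustration of the reconciliation duty described above, the following is a minimal sketch of a source-to-mart reconciliation check. The function name, key, and measure fields are hypothetical, not taken from the role description; a production version would run against the actual source extracts and data mart tables.

```python
def reconcile(source_rows, mart_rows, key="order_id", measure="amount"):
    """Compare keys and a summed measure between a source extract and a data mart."""
    src_keys = {r[key] for r in source_rows}
    mart_keys = {r[key] for r in mart_rows}
    report = {
        "source_count": len(source_rows),
        "mart_count": len(mart_rows),
        # Keys present in the source but absent from the mart, and vice versa.
        "missing_in_mart": sorted(src_keys - mart_keys),
        "unexpected_in_mart": sorted(mart_keys - src_keys),
        "source_total": sum(r[measure] for r in source_rows),
        "mart_total": sum(r[measure] for r in mart_rows),
    }
    # The check passes only if keys match exactly and the measure balances.
    report["balanced"] = (
        not report["missing_in_mart"]
        and not report["unexpected_in_mart"]
        and report["source_total"] == report["mart_total"]
    )
    return report

source = [{"order_id": 1, "amount": 100}, {"order_id": 2, "amount": 50}]
mart = [{"order_id": 1, "amount": 100}]
result = reconcile(source, mart)
# order_id 2 is missing from the mart, so the check reports an imbalance.
```

A monitoring job would typically run such a check after each transformation step and alert when `balanced` is false.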
ADMINISTRATION
years' experience in Contracts and Subcontracts administration for an EPC/Contractor Company, preferably with
management, shrinkage, general housekeeping and administration in line with merchandising, SAPC regulations