Reference: JHB009406-BG-1 AWS Data Engineer

ESSENTIAL SKILLS REQUIREMENTS:
· Exceptional experience/understanding of Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash
· Experience in working with enterprise platforms such as BMW Cloud Data Hub (CDH) and the BMW CDEC Blueprint
· Any expertise in data modelling and Oracle SQL
· Exceptional analytical skills for analysing large and complex data sets
· Perform thorough testing and data validation to ensure the accuracy of data transformations
· Strong knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc.
· Experience working with Data Quality and building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms
· Familiarity with data stores, and with implementation and operations modules for integration requirements
· Preparing test data and documentation; conducting unit tests, regression tests, meetings, etc.
· Willingness and ability to work on weekends and public holidays when required
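The testing and data-validation requirement above can be illustrated with a minimal sketch. The helper name, check names, and sample data below are illustrative, not from the listing; a real pipeline (e.g. an AWS Glue job) would run comparable checks with PySpark or a data-quality framework.

```python
def validate_transformation(source_rows, target_rows, key="id"):
    """Basic post-ETL checks: row counts match, no keys lost, no duplicates.

    Illustrative only -- production validation would also compare column
    values, null rates, and schema against the source system.
    """
    source_keys = {row[key] for row in source_rows}
    target_keys = {row[key] for row in target_rows}
    return {
        # Same number of rows before and after the transformation
        "row_count_match": len(source_rows) == len(target_rows),
        # Every source key is still present in the target
        "no_missing_keys": source_keys <= target_keys,
        # The key column is still unique in the target
        "no_duplicate_keys": len(target_keys) == len(target_rows),
    }

# Example: a transformation that upper-cases a name column.
source = [{"id": "1", "name": "anna"}, {"id": "2", "name": "ben"}]
target = [{"id": "1", "name": "ANNA"}, {"id": "2", "name": "BEN"}]
print(validate_transformation(source, target))
```

Each check returns a boolean, so a failed run can report exactly which invariant the transformation broke.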
· Azure, AWS, or SAP BTP knowledge
· PostgreSQL, Python data analysis
· Power Apps or other low-code tools
· SAP
· Ensures that the required data collection sheets are filled out (master data & customizing)
· Ensures
· Any operating system certification relating to data management
· Any programming certification
· Fluent