Ensures that the required data collection sheets are filled out (master data & customizing)
Reference: JHB009406-BG-1 AWS Data Engineer

ESSENTIAL SKILLS REQUIREMENTS:
· Exceptional experience/understanding of Oracle/PostgreSQL
· PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash
· BMW Cloud Data Hub (CDH), BMW CDEC Blueprint
· Expertise in data modelling and Oracle SQL
· Exceptional analytical skills for analysing large and complex data sets
· Perform thorough testing and data validation to ensure the accuracy of data transformations
· Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV etc.
· Experience working with Data Quality
· Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms
· Familiarity with data stores
· Experience working with Enterprise
· Strong written
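As a concrete illustration of the "testing and data validation" skill named above, here is a minimal sketch in plain Python (stdlib only; the `transform` and `validate` functions are hypothetical stand-ins — a real pipeline of the kind the listing describes would run in PySpark or AWS Glue and read Parquet rather than inline CSV):

```python
import csv
import io
import json

def transform(rows):
    # Hypothetical ETL step: cast the string "amount" column to float.
    return [{"id": r["id"], "amount": float(r["amount"])} for r in rows]

def validate(source_rows, transformed_rows):
    """Check the transformation preserved row count and the amount total."""
    assert len(source_rows) == len(transformed_rows), "row count drifted"
    src_total = sum(float(r["amount"]) for r in source_rows)
    out_total = sum(r["amount"] for r in transformed_rows)
    assert abs(src_total - out_total) < 1e-9, "amount total drifted"
    return True

# Tiny inline sample standing in for a source extract.
csv_text = "id,amount\n1,10.5\n2,4.5\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))
out = transform(rows)
print(validate(rows, out))  # True
print(json.dumps(out))
```

The same count-and-checksum pattern scales to distributed frameworks: in PySpark the equivalent checks would compare `df.count()` and an aggregated sum between the source and target DataFrames.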
and support test case creation · Coordinate test data creation with the developers and test analysts · plan to Agile Master · Act as single point of contact between Developers and Testers · Set up and track
Frontend, Backend and Integration testing. · Test data management. · Manual, Performance, Security and · Identification, Creation & Sanitisation of Test Data · Security and Reliability Testing. · Technical management, maintenance and preparation of test data. · Interpretation of testing results and logging
SCSS · Java 8 · Spring Framework (Spring Boot, MVC, Data/JPA, Security etc.) · AWS stack such as Kinesis · Principles · Design patterns · Clean coding principles · Data structures and Algorithms · Jenkins (CI/DevOps)
· Any operating system certification relating to data management · Any programming certification · Fluent