Reference: JHB009406-BG-1
AWS Data Engineer

ESSENTIAL SKILLS REQUIREMENTS:
- Exceptional experience/understanding of Oracle/PostgreSQL
- PySpark
- Boto3
- ETL
- Docker
- Linux/Unix
- Big Data
- PowerShell/Bash
- Cloud Data Hub (CDH) and CDEC Blueprint experience
- Experience in working with Enterprise
- Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc.
- Experience working with data quality and complex data sets
- Perform thorough testing and data validation to ensure the accuracy of data transformations
- Experience building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms
- Familiar with data stores
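As an illustration of the data-validation requirement above, here is a minimal sketch in plain Python. In a real pipeline these checks would typically run as PySpark/AWS Glue DataFrame operations; the field names and the "required fields" rule here are hypothetical:

```python
import csv
import io

# Hypothetical data-quality check run after a transformation step:
# reject any row that is missing a required field.
def validate_rows(rows, required_fields):
    """Return (valid_rows, errors); errors holds (row_index, missing_fields)."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            errors.append((i, missing))
        else:
            valid.append(row)
    return valid, errors

# Tiny in-memory CSV standing in for a pipeline stage's output.
raw = "id,amount\n1,10.5\n2,\n3,7.0\n"
rows = list(csv.DictReader(io.StringIO(raw)))
valid, errors = validate_rows(rows, ["id", "amount"])
# Row at index 1 is rejected because 'amount' is empty.
```

Counting rejected rows like this (and failing the job when the error rate is too high) is one common way to make a Glue or PySpark job fail fast instead of propagating bad data downstream.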
- Performance, and worked with software installation capture tools
- Experience in optimisation of functional
- Environments to build a database access monitoring solution for PostgreSQL databases
- In-depth knowledge and presentation skills
- Knowledge of data modelling and data visualisation tools
- Cloud experience:
- Frontend, backend and integration testing
- Test data management
- Manual, performance and security testing
- Identification, creation & sanitisation of test data
- Security and reliability testing
- Technical management, maintenance and preparation of test data
- Interpretation of testing results and logging
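The test-data sanitisation item above can be sketched in a few lines of plain Python. The field names and the masking rule are hypothetical assumptions; real test-data tooling would cover far more cases:

```python
import hashlib

# Hypothetical sanitisation step: replace direct identifiers in production
# records with deterministic pseudonyms before the records are reused as
# test data. Deterministic hashing keeps joins across tables consistent.
SENSITIVE_FIELDS = {"name", "email"}

def pseudonymise(value: str) -> str:
    """Irreversible, repeatable token derived from the original value."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def sanitise_record(record: dict) -> dict:
    return {
        k: pseudonymise(v) if k in SENSITIVE_FIELDS else v
        for k, v in record.items()
    }

prod_row = {"id": 7, "name": "Alice", "email": "alice@example.com", "amount": 10.5}
test_row = sanitise_record(prod_row)
```

The design choice worth noting: deterministic pseudonyms (unlike random fakes) let the same person map to the same token in every table, so referential integrity in the test data survives sanitisation.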
- SDLC
- Previous exposure to Business Intelligence / Data Analytics
- Knowledge of Test Management and transitioning to DevOps and BizDevOps
- Understanding of AWS & Data Engineering processes
- Experience in programming
Reference: JHB009276-ZN-1
Data Streaming Platform Engineer

Our client in the IT industry is looking for a Data Streaming Platform Engineer. If you meet the requirements, apply today.
- Managing applications on Kubernetes clusters
- Data modelling and database technologies (relational
- Troubleshooting in infrastructure
- Experience in data-driven monitoring and analytics
- Experience with problem remediation
- Machine learning (ML) for data analytics
- Network troubleshooting skills
- Understand
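The data-driven monitoring item above can be illustrated with a deliberately simple sketch: flagging outliers in a metric series with a z-score rule. The metric name, sample values, and the 2.5-standard-deviation threshold are all hypothetical choices for illustration; production monitoring would use more robust statistics:

```python
import statistics

# Hypothetical monitoring check: flag samples more than `threshold`
# sample standard deviations away from the mean (a simple z-score rule).
def anomalies(samples, threshold=2.5):
    mean = statistics.mean(samples)
    stdev = statistics.stdev(samples)
    if stdev == 0:
        return []
    return [x for x in samples if abs(x - mean) / stdev > threshold]

# Latency samples in milliseconds; the final value is an obvious spike.
latencies = [12, 11, 13, 12, 11, 12, 13, 12, 11, 250]
```

Note that a z-score against the *sample* standard deviation can never exceed (n-1)/sqrt(n), so for small windows a very high threshold would flag nothing; robust alternatives (median/MAD) are usually preferred in practice.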