Skills: Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, Cloud Data Hub (CDH), CDEC Blueprint experience. Knowledge of data formats such as Parquet, Avro, JSON, XML, CSV etc. Experience working with Data Quality.
Urgent role. Role Title (Level): CAX VDI Engineer (Advanced). Product / Feature Team Information:
· Any operating system certification relating to data management
· Any programming certification
· Any VDI deployment experience
· Understanding CA Data Management / "Follow the Data" as a VDI deployment approach
· Management / asset management
· Defining specific engineering roles for VDIs matching their exact application (Crash Test simulation)
· Understanding global engineering (ItO) resource demands and consulting / designing
Reference: JHB009341-BG-1. Operations Engineer. ESSENTIAL SKILLS REQUIREMENTS: Diploma / Degree in IT. Ability to analyze and troubleshoot potential solutions. Data analytics and trend analysis. Presentation skills.
Reference: JHB009278-BM-1. Contract Starts: 01.06.2024. Contract Ends: 31.12.2026. Location: Midrand/Menlyn/Rosslyn/Home Office rotation. ESSENTIAL SKILLS REQUIREMENTS: Create architecture components for Microservices Architecture, Cloud Architecture and Container Architecture. Develop, test, and deploy.
A client in the IT industry is looking for a Data Streaming Platform Engineer; if you meet the requirements, kindly apply. Skills: deploying and managing applications on Kubernetes clusters; data modelling and database technologies (relational).
A client in the IT industry is looking for a Software Engineer (Conversational AI); please apply if you meet the requirements. Advantageous: Microsoft Certified: Azure Developer Associate; AZ-203: Developing Solutions for Microsoft Azure.
Role Title (Role Level): SAP Operations Engineer (Senior). Product / Feature Team Information:
· Functional Specifications
· Preparing test data for testing of CRs (Change Requests)
· Testing
Reference: JHB009406-BG-1. AWS Data Engineer. ESSENTIAL SKILLS REQUIREMENTS:
- Exceptional experience/understanding of Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash
- Experience working with Enterprise data
- Knowledge of data formats such as Parquet, Avro, JSON, XML, CSV etc.
- Experience working with Data Quality on complex data sets
- Perform thorough testing and data validation to ensure the accuracy of data transformations
- Experience building data pipelines using AWS Glue or Data Pipeline, or similar platforms
- Familiar with data stores
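The "testing and data validation" requirement above can be sketched in miniature. The listing names PySpark and AWS Glue but gives no code, so this is a hypothetical, stdlib-only Python sketch of the validate-before-load step of an ETL job; the schema (`id`, `amount`) and the sample rows are invented for illustration.

```python
import csv
import io
import json

# Hypothetical raw extract (in a real pipeline this would come from a
# data store such as S3 or PostgreSQL, not an inline string).
RAW = "id,amount\n1,10.50\n2,not-a-number\n3,7.25\n"

def extract(text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def is_valid(row):
    """Basic per-row validation: positive id, non-negative numeric amount."""
    try:
        return int(row["id"]) > 0 and float(row["amount"]) >= 0
    except (KeyError, ValueError):
        return False

def transform(rows):
    """Split rows into accepted and rejected sets before loading."""
    good = [r for r in rows if is_valid(r)]
    bad = [r for r in rows if not is_valid(r)]
    return good, bad

if __name__ == "__main__":
    good, bad = transform(extract(RAW))
    print(json.dumps(good))  # rows 1 and 3 pass validation
    print(len(bad))          # 1 row rejected (non-numeric amount)
```

In a PySpark/Glue job the same idea would typically be expressed as DataFrame filters or Glue data-quality rules rather than row-by-row Python, but the principle (reject bad records before they reach the target store) is the same.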
· Frontend, Backend and Integration testing
· Test data management
· Manual, Performance and Security testing
· Identification, Creation & Sanitisation of Test Data
· Security and Reliability Testing
· Technical management, maintenance and preparation of test data
· Interpretation of Testing Results and logging
SDLC. Previous exposure to Business Intelligence / Data Analytics. Knowledge of Test Management, transitioning and BizDevOps. Understanding of AWS & Data Engineering processes. Experience in programming and test.