Our client requires the services of a Data Engineer/Scientist (Expert) – Midrand/Menlyn/Rosslyn/Home Office. ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line. Skills: Boto3; ETL; Docker; Linux/Unix; Big Data; PowerShell/Bash; GROUP Cloud Data Hub (CDH); GROUP CDEC Blueprint; knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV; experience working with Data Quality Tools.
Our client requires the services of a Data Scientist/Engineer (Entry) – Midrand/Menlyn/Rosslyn/Home Office
Reference: JHB001726-KK-1 Join our IT Hub South Africa as a Data Scientist (AI Platform), focusing on innovative AI
Reference: JHB001765-Laka-1 Join our dynamic IT Hub South Africa as a Data Scientist on the AI Platform team. You will bring excellent teamwork abilities and a passion for AI and data analytics. ESSENTIAL SKILLS REQUIREMENTS: - Strong
Reference: JHB001766-Laka-1 Looking for a dynamic Data Scientist to join our team at the forefront of AI
ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line. Skills: Boto3; ETL; Docker; Linux/Unix; Big Data; PowerShell/Bash; GROUP Cloud Data Hub (CDH); GROUP CDEC Blueprint; Business Intelligence (BI) experience; technical data modelling and schema design ("not drag and drop").
based in Midrand, is urgently seeking a Back-Order Clerk to join their team. 2-3 years' experience in a similar role.
Advanced Data Streaming Platform Specialist. Focus on innovative IT solutions in SAP, Java, Cloud, and Data Analytics, and on managing applications on Kubernetes clusters. - Data modelling and database technologies (relational,
2. Utilise AWS services and API Gateway to build scalable and efficient data pipelines.
3. Collaborate with cross-functional teams. Qualifications:
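The pipeline responsibility above (using AWS services to build scalable, efficient data pipelines with tools such as Boto3 and ETL) might be sketched as follows. This is a minimal illustration only: the bucket, keys, and record shape are hypothetical, and a real pipeline would take them from configuration rather than defaults.

```python
import json

try:
    import boto3  # AWS SDK for Python, listed among the role's skills
except ImportError:
    boto3 = None  # allows the pure transform step below to run without AWS


def transform_records(raw_lines):
    """Minimal ETL transform step: parse JSON lines, drop malformed rows,
    and normalise keys to lowercase."""
    cleaned = []
    for line in raw_lines:
        try:
            row = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip rows that are not valid JSON
        cleaned.append({key.lower(): value for key, value in row.items()})
    return cleaned


def run_pipeline(bucket="example-raw-bucket", key="events/today.jsonl"):
    """Hypothetical extract-transform-load run against S3 via boto3:
    read raw JSON lines, clean them, and write the result back."""
    if boto3 is None:
        raise RuntimeError("boto3 is required for the S3 steps")
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    cleaned = transform_records(body.splitlines())
    s3.put_object(
        Bucket=bucket,
        Key="clean/today.json",
        Body=json.dumps(cleaned).encode("utf-8"),
    )
    return cleaned
```

The transform is kept as a pure function so it can be unit-tested without AWS credentials, which is a common pattern when building pipelines of this kind.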