JHB001726-KK-1 Join our IT Hub in South Africa as a Data Scientist (AI Platform), focusing on innovative AI solutions.
Join our IT Hub in South Africa as an Entry-Level Network Engineer. You'll be part of a dynamic team focusing on Data Centre Networks. You will: - Implement Data Centre Networks based on Cisco ACI & Nexus - Troubleshoot Data Centre networks
and support test case creation - Coordinate test data creation with the developers and test analysts - Maintain a “lessons learnt” sheet related to test topics - Ensure entry/exit criteria for all test phases are met on time
Join us as an AWS Data Engineer (Expert) and leverage your skills in building and maintaining Big Data Pipelines, along with expertise in Python, SQL, and AWS services, to drive our data initiatives forward. Be part of a collaborative team where your contributions ensure accurate and secure data management. ESSENTIAL SKILLS REQUIREMENTS: - Technical ETL - Docker - Linux / Unix - Big Data - PowerShell / Bash - Cloud Data Hub (CDH) - CDEC Blueprint ADVANTAGEOUS SKILLS REQUIREMENTS: - Expertise in data modelling with Oracle SQL - Exceptional analytical skills
Join us as an Expert AWS Data Engineer in South Africa. Design, implement, and optimize Big Data Pipelines using AWS services. Ensure data integrity, security, and compliance while collaborating with teams, and apply your expertise in AWS technologies and data engineering to drive impactful data solutions. Key Responsibilities: - Design, implement, and optimize Big Data Pipelines using AWS services. - Ensure data integrity, security, and compliance. - Maintain documentation. - Conduct thorough testing and validation of data solutions. - Stay updated with industry trends.
Description: AWS Data Engineer (Expert). Join our dynamic IT Hub in South Africa as an AWS Data Engineer (Expert) and manage cutting-edge Big Data pipelines, ensuring secure and efficient data sharing. Work with top technologies and collaborate in agile teams to deliver impactful data solutions. Ideal candidates have a relevant degree. Responsibilities: - Develop and maintain data pipelines using AWS technologies. - Implement and manage Big Data solutions. - Collaborate within Agile teams to deliver high-quality data solutions. Essential Skills: - Proficiency in Terraform
you'll leverage your expertise in Python programming, data intelligence, and business intelligence to develop solutions. Key responsibilities include building data pipelines and supporting business intelligence processes. Responsibilities: - Develop and maintain data solutions using Python, focusing on data pipelines (ETL, SQL). - Support proofs of concept (POCs). Required Skills: - Expertise in Data Intelligence and Business Intelligence. - Proficiency in Python programming and building big data pipelines (ETL, SQL). - Knowledge of Qlikview and awareness
Project Management - Good knowledge of the role of data and the use of technology in managing tax - Maintain working guidelines and ensure timely tax documentation. - Define the data needs of the tax team and continue the development of Group-wide strategies. - Deliver the tactical and strategic data and digital tax solutions set out in the BMW tax
and Terraform. Experience in data-driven programming, SQL, and big data (PySpark). AWS certifications
providing input on benefits and risks. Prepare test data and support testing activities, including unit testing, together with process owners. Prepare the cut-over strategy, including data migration planning. Support pre- and post-Go-Live activities.