Are you ready to kickstart your career in data science? The IT Hub is hiring an Entry-Level Data Scientist to join our dynamic AI Platform team.
What You'll Do:
- Leverage your skills in data analytics and machine learning.
Reference: JHB001787-GuguN-1
Job Description: The Data Scientist will focus on developing cost variance use cases as part of Exxaro Resources' Data & AI Strategy. This position entails working onsite, applying statistical and machine learning techniques to drive data-driven decision making.
- Present findings to both technical and non-technical stakeholders.
What We're Looking For:
- Experience: 4-6 years in data science roles.
- Skills: Proficiency in Python, statistical analysis, and modeling techniques. Experience with data visualization tools (e.g., Tableau, Power BI).
Reference: JHB001726-KK-1
Join our IT Hub South Africa as a Data Scientist - AI Platform, focusing on innovative AI solutions.
Join us as an AWS Data Engineer (Expert) and leverage your skills in building and maintaining Big Data pipelines, applying your expertise in Python, SQL, and AWS services to drive our data initiatives forward. Be part of a collaborative team where your contributions ensure accurate and secure data management.
ESSENTIAL SKILLS REQUIREMENTS:
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- Cloud Data Hub (CDH)
- CDEC Blueprint
ADVANTAGEOUS SKILLS REQUIREMENTS:
- Expertise in data modelling with Oracle SQL.
- Exceptional analytical skills.
Description: AWS Data Engineer (Expert). Join our dynamic IT Hub in South Africa as an AWS Data Engineer (Expert) and manage cutting-edge Big Data pipelines, ensuring secure and efficient data sharing. Work with top technologies and collaborate in agile teams to deliver impactful data solutions. Ideal candidates have a relevant degree.
Responsibilities: Develop and maintain data pipelines using AWS technologies. Implement and manage Big Data solutions. Collaborate within Agile teams to deliver high-quality data solutions.
Essential Skills: Proficiency in Terraform.
In this role, you'll leverage your expertise in Python programming, data intelligence, and business intelligence to develop solutions. Key responsibilities include building data pipelines and supporting business intelligence processes.
Responsibilities: Develop and maintain data solutions using Python, focusing on data pipelines (ETL, SQL). Support proofs of concept (POCs).
Required Skills: Expertise in Data Intelligence and Business Intelligence. Proficiency in Python programming and building big data pipelines (ETL, SQL). Knowledge of QlikView.
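The ETL-style data pipelines mentioned across these postings can be sketched in a few lines. The example below is illustrative only: the column names, figures, and the use of SQLite are assumptions for the sketch, not part of any posting (the actual stacks named here use AWS services, Terraform, and PySpark). It also ties in the cost-variance theme from the Exxaro posting above.

```python
# Minimal extract-transform-load (ETL) sketch. Illustrative only:
# the schema and sample figures are invented for this example.
import csv
import io
import sqlite3

RAW_CSV = """date,cost_centre,budget,actual
2024-01-31,MINING,100000,112500
2024-02-29,MINING,100000,97000
"""

def extract(text):
    """Extract: parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: derive a cost-variance column (actual - budget)."""
    return [
        (r["date"], r["cost_centre"],
         float(r["actual"]) - float(r["budget"]))
        for r in rows
    ]

def load(records, conn):
    """Load: write the transformed records into a SQL table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS cost_variance "
        "(date TEXT, cost_centre TEXT, variance REAL)")
    conn.executemany(
        "INSERT INTO cost_variance VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute(
    "SELECT SUM(variance) FROM cost_variance").fetchone()[0]
```

In a production pipeline the same three stages would read from S3 or a data lake, run the transform in PySpark, and load into a warehouse; the shape of the code stays the same.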
This position centres on integrations that connect various software applications and data sources. The role involves collaborating with cross-functional teams, documenting processes, and maintaining compliance with data security regulations. Additionally, they will provide support.
Requirements:
- Knowledge of APIs, web services (SOAP, REST), and data formats (XML/JSON).
- Experience with programming and cloud platforms (AWS, Azure, Google Cloud).
- Knowledge of data integration and ETL tools.
- Familiarity with Agile methodologies.
Join us as a Network Engineer. You'll be part of a dynamic team focusing on Data Centre Networks, utilizing Cisco ACI & Nexus. You will:
- Implement Data Centre Networks based on Cisco ACI & Nexus
- Troubleshoot Data Centre networks
Requirements include Terraform; experience in data-driven programming, SQL, and big data (PySpark); and AWS certifications.
Provide input on benefits and risks. Prepare test data and support testing activities, including unit testing, with process owners. Prepare the cut-over strategy, including data migration planning. Support pre- and post-Go-Live activities.