Are you ready to kickstart your career in data science? The IT Hub is hiring an Entry-Level Data Scientist to join our dynamic AI Platform team.
What You'll Do:
- Leverage your skills in data analytics and machine learning.
Reference: JHB001787-GuguN-1
Job Description: The Data Scientist will focus on developing cost variance use cases as part of Exxaro Resources' Data & AI Strategy. This position entails working onsite and applying statistical and machine learning techniques to drive data-driven decision making.
- Present findings to both technical and non-technical stakeholders.
What We're Looking For:
- Experience: 4-6 years in data science roles.
- Skills: Proficiency in Python and in analysis and modeling techniques. Experience with data visualization tools (e.g., Tableau, Power BI).
The successful candidate will build integrations that connect various software applications and data sources. This role involves collaborating with cross-functional teams and ensuring compliance with data security regulations. Additionally, they will provide technical support and training to end-users.
What We're Looking For:
- Knowledge of APIs, web services (SOAP, REST), and data formats (XML/JSON).
- Experience with programming and with cloud platforms (AWS, Azure, Google Cloud).
- Knowledge of data integration and ETL tools.
- Familiarity with Agile methodologies.
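To make the integration requirement above concrete, here is a minimal sketch of one everyday task in such a role: receiving an XML payload from one system and re-emitting it as JSON for another. The `<order>` schema and field names are invented purely for illustration.

```python
import json
import xml.etree.ElementTree as ET

def xml_order_to_json(xml_payload: str) -> str:
    """Convert a flat XML <order> document into a JSON string."""
    root = ET.fromstring(xml_payload)
    # Each direct child element becomes one key/value pair.
    record = {child.tag: child.text for child in root}
    return json.dumps(record, sort_keys=True)

sample = "<order><id>42</id><customer>Acme</customer></order>"
print(xml_order_to_json(sample))
# {"customer": "Acme", "id": "42"}
```

Real integrations add schema validation and error handling, but the XML-in, JSON-out shape is the same.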
- Integration Testing: Integrate testing with other modules.
- Data Management: Prepare Master Data templates (Material Master, Vendor Master) and related data objects.
- Documentation and Training: Prepare user manuals and conduct training for business process owners.
- Change Requests: Analyse Change Requests (CRs), write Functional Specifications, prepare test data, test CRs, and prepare test results.
- Regression Testing: Support regression testing.
Evaluate proposed changes, providing input on benefits and risks. Prepare test data and support testing activities, including unit testing. Conduct training for business process owners. Prepare the cut-over strategy, including data migration planning.
Job Description: AWS Data Engineer (Expert). Join our dynamic IT Hub in South Africa as an AWS Data Engineer (Expert) and manage cutting-edge Big Data pipelines, ensuring secure and efficient data sharing. Work with top technologies and collaborate in agile teams to deliver impactful data solutions. Ideal candidates have a relevant degree.
Responsibilities:
- Develop and maintain data pipelines using AWS technologies.
- Implement and manage Big Data solutions.
- Collaborate within Agile teams to deliver high-quality data solutions.
Essential Skills: Proficiency in Terraform.
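As a toy illustration of one detail behind the Big Data pipelines this posting describes, the sketch below builds Hive-style date-partitioned object keys, a common layout for daily batches landing in S3. The dataset and file names are invented for the example, and no AWS calls are made.

```python
from datetime import date

def partitioned_key(dataset: str, d: date, filename: str) -> str:
    """Build a Hive-style partitioned object key for a daily batch."""
    return (f"{dataset}/year={d.year}/month={d.month:02d}/"
            f"day={d.day:02d}/{filename}")

print(partitioned_key("sales", date(2024, 7, 3), "part-0000.parquet"))
# sales/year=2024/month=07/day=03/part-0000.parquet
```

Zero-padding the month and day keeps keys lexically sortable, which partition-pruning query engines rely on.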
Implement backend services using Java 8, the Spring Framework (Spring Boot, MVC, Data/JPA, Security), and RESTful web services. Prepare technical documentation and provide user training.
Required Skills:
- Extensive experience with and proficiency in Java 8, the Spring Framework (Spring Boot, MVC, Data/JPA, Security), and RESTful web services.
- Hands-on experience.
- Following the provided documentation approach.
- Ensuring required data collection sheets are filled out.
- Considering business needs for defect resolution.
- Planning and ensuring training is conducted.
- Ensuring migration mapping workshops are held.
Preparing user manuals and conducting training for business process owners. Supporting go-live preparation. Analysing Change Requests (CRs) and writing Functional Specifications for them. Preparing test data for testing CRs. Testing CRs.
In this role, you'll leverage your expertise in Python programming, data intelligence, and business intelligence to develop solutions. Key responsibilities include building data pipelines and supporting business intelligence processes.
Responsibilities: Develop and maintain data solutions using Python, focusing on data pipelines (ETL, SQL). Support proofs of concept (POCs).
Required Skills: Expertise in Data Intelligence and Business Intelligence. Proficiency in Python programming and building big data pipelines (ETL, SQL). Knowledge of Qlikview.
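The "data pipelines (ETL, SQL)" responsibility above can be sketched end to end in a few lines: extract rows, transform them, load them into a SQL table, and query the result. The table and column names are invented for illustration, and an in-memory list and SQLite stand in for the real source systems and warehouse.

```python
import sqlite3

def run_pipeline(raw_rows):
    # Extract: raw_rows stands in for data pulled from a source system.
    # Transform: normalise customer names and drop rows missing an amount.
    cleaned = [(name.strip().title(), amount)
               for name, amount in raw_rows if amount is not None]
    # Load: write into a SQL table, then aggregate with a query.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    (total,) = conn.execute("SELECT SUM(amount) FROM sales").fetchone()
    conn.close()
    return total

print(run_pipeline([(" acme ", 10.0), ("beta", None), ("Gamma", 5.5)]))
# 15.5
```

A production pipeline would add incremental loads, logging, and failure recovery, but the extract-transform-load shape is the same.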