The IT Hub is hiring an Entry-Level Data Scientist to join our dynamic AI Platform team. Are you ready to kickstart your career in data science? Use your skills in data analytics and machine learning. What You'll Do: - Leverage data analytics
Our client requires the services of a Data Scientist. POSITION: Contract role for 3 months. REQUIREMENTS: Experience: 4-6 years. Data Science. Cost-variance use case build.
A mining client is currently seeking the services of a Data/Technical Analyst to join their team for a project. KEY PERFORMANCE AREAS: · View, analyse and compare data from the mining site to provide valuable insights that drive business decisions · Translate data into reports and dashboards and communicate them to stakeholders · Work with data sets from different data sources to collate information for stakeholders REQUIREMENTS: · Data extraction, capture and analysis skills · Excellent presentation, communication and reporting skills · Ability to apply data warehousing techniques and approaches
Provide tailored responses for the respective Group After Sales departments at each site, including Centurion and Cape Town. Support departments that are in the process of implementing governance in a non-compliant organisation. • Ensure a continuous and stable integration of data flow between the SAP ET2000, AutoPart and E-Parts systems • Liaise with network and ET2000 key leads to ensure system health and compliance, and support relationships and commitments. Good knowledge of the Occupational Health & Safety Management System (ISO 45001) and Energy
Description: AWS Data Engineer (Expert). Join our dynamic IT Hub in South Africa as an AWS Data Engineer (Expert) and manage cutting-edge Big Data pipelines, ensuring secure and efficient data sharing. Work with top technologies and collaborate in agile teams to deliver impactful data solutions. Ideal candidates have a relevant degree. Responsibilities: Develop and maintain data pipelines using AWS technologies. Implement and manage Big Data solutions. Collaborate within Agile teams to deliver high-quality data solutions. Essential Skills: Proficiency in Terraform
In this role, you'll leverage your expertise in Python programming, data intelligence, and business intelligence to develop data solutions. Key responsibilities include building data pipelines, supporting business intelligence processes, and delivering proofs of concept (POCs). Responsibilities: Develop and maintain data solutions using Python, focusing on data pipelines (ETL, SQL). Support proofs of concept (POCs). Required Skills: Expertise in Data Intelligence and Business Intelligence. Proficiency in Python programming and building big data pipelines (ETL, SQL). Knowledge of QlikView and awareness
Experience in data-driven programming, SQL, and big data (PySpark). AWS certifications
and Execution · Risk Management · Cross-Functional Department Collaboration · Stakeholder Communication. Qualifications: