Our client requires the services of a Data Scientist/Engineer (Entry) for a Midrand/Menlyn/Rosslyn/Home Office project. Ability and willingness to coach and train fellow colleagues and users when required.

ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP data platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the applicable requirements. SKILLS: Boto3; ETL; Docker; Linux/Unix; Big Data; PowerShell/Bash; GROUP Cloud Data Hub (CDH); GROUP CDEC Blueprint; Business Intelligence (BI) experience; technical data modelling and schema design ("not drag and drop").
Our client requires the services of a Data Scientist. POSITION: Contract role for 3 months. REQUIREMENTS: 4-6 years' experience; cost variance use case build; Data Science.
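The "cost variance use case" above typically means comparing planned against actual spend per line item. A minimal sketch (the sign convention and the item names are illustrative assumptions, not from the posting):

```python
def cost_variance(planned: float, actual: float) -> float:
    """Variance of actual spend against plan (positive = over budget)."""
    return actual - planned

# Hypothetical budget lines: item -> (planned, actual).
budget = {"drilling": (100_000.0, 112_500.0), "hauling": (80_000.0, 74_000.0)}
for item, (planned, actual) in budget.items():
    print(item, cost_variance(planned, actual))  # drilling 12500.0, hauling -6000.0
```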
A mining client is currently seeking the services of a Data/Technical Analyst to join their team for a project. KEY PERFORMANCE AREAS: ·View, analyse and compare data from mining sites to provide valuable insights that drive business decisions ·Translate data into reports and dashboards and communicate these to stakeholders ·Work with data sets from different data sources to collate information for stakeholders. Data extraction, capture and analysis skills. Excellent presentation, communication and reporting skills.
and support test case creation. Coordinate test data creation with the developers and test analysts. Feasibility assessment. Technical test case creation. Clear defect capturing. Defect workflow adherence. Identification, creation and sanitisation of test data. Manual and automated test execution. Maintenance.
Writing test cases, test execution and defect capture. Planning and effort estimation for test case execution and completion from a testing perspective. Coordinate test data creation with the developers. Track new/changed requirements.
Ability to develop in data-driven programming languages such as Python and to build Big Data pipelines such as ETL. Meet with end users and gather requirements. User training. System testing/parallel runs. System implementation. Agile Working Model (AWM) Charter. ADVANTAGEOUS SKILLS: Data and API mining. Knowledge of security best practices and of setting up alerting pipelines. Comfortable with data structures and algorithms. Understanding of integration. Docker container creation and usage. Familiarity with data streaming services such as Apache Kafka. Coordination.
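The Python/ETL skill set listed above can be illustrated with a minimal extract-transform-load sketch. The field names (`site`, `cost`) are hypothetical, and a real pipeline would read from and write to a data platform rather than in-memory CSV/JSON:

```python
import csv
import io
import json

def transform(row: dict) -> dict:
    """Normalise one raw record: trim whitespace, cast the cost field."""
    return {"site": row["site"].strip(), "cost": float(row["cost"])}

def run_pipeline(raw_csv: str) -> list:
    """Extract rows from CSV text, transform each, load as JSON lines."""
    rows = csv.DictReader(io.StringIO(raw_csv))
    return [json.dumps(transform(r)) for r in rows]

# Example: two records flow through extract -> transform -> load.
raw = "site,cost\n Rosslyn ,120.5\nMidrand,99.0\n"
for line in run_pipeline(raw):
    print(line)
```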
environments to build a database access monitoring solution for PostgreSQL databases. In-depth knowledge and presentation skills. Knowledge of data modelling and data visualisation tools. Cloud experience. Ability and willingness to coach and train fellow colleagues and users when required.
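One simple building block for the database access monitoring described above is tallying connection events from the PostgreSQL server log (this is a sketch, and assumes the server's `log_connections` setting is enabled so that "connection authorized" lines appear):

```python
import re
from collections import Counter

# Matches PostgreSQL "connection authorized" log lines.
CONN_RE = re.compile(r"connection authorized: user=(\w+) database=(\w+)")

def count_access(log_lines) -> Counter:
    """Tally connections per (user, database) pair from server log lines."""
    counts = Counter()
    for line in log_lines:
        m = CONN_RE.search(line)
        if m:
            counts[(m.group(1), m.group(2))] += 1
    return counts

logs = [
    "2024-05-01 10:00:01 UTC LOG:  connection authorized: user=etl database=reporting",
    "2024-05-01 10:00:07 UTC LOG:  connection authorized: user=etl database=reporting",
    "2024-05-01 10:01:13 UTC LOG:  checkpoint starting: time",
]
print(count_access(logs)[("etl", "reporting")])  # -> 2
```

A production solution would stream the log (or query `pg_stat_activity`) instead of reading a fixed list.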
input in terms of benefits and risks. Preparing test data for testing of user stories. Executing and/or supporting training of business process owners. Preparing cut-over strategy, e.g. data migration and Go-Live.
documentation. Enable end-user knowledge transfer and training. Work closely with other functional consultants. IT Service Management (ITSM). OPS Advanced Training. OPS Basic Training. Problem Management (PM). Release Planning. Informatics understanding to resolve problems related to data flow between various systems and middleware.