Our client requires the services of a Data Scientist/Engineer (Entry) – Midrand/Menlyn/Rosslyn/Home Office. Ability and willingness to coach and give training to fellow colleagues and users when required.
ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with governance requirements.
SKILLS: Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint, Business Intelligence (BI) experience, technical data modelling and schema design ("not drag and drop").
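The Boto3/ETL pairing listed above typically means extracting data from S3, transforming it, and loading the result back. A minimal sketch, assuming a CSV source and a hypothetical "keep positive amounts" transform (bucket and key names are placeholders):

```python
import csv
import io


def transform(rows):
    """Hypothetical transform step: keep rows with a positive 'amount'."""
    return [r for r in rows if float(r["amount"]) > 0]


def run_etl(bucket, src_key, dst_key):
    """Extract a CSV from S3, transform it, and load the result back."""
    import boto3  # AWS SDK; imported here so transform() stays dependency-free

    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=src_key)["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(body)))
    cleaned = transform(rows)

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=cleaned[0].keys())
    writer.writeheader()
    writer.writerows(cleaned)
    s3.put_object(Bucket=bucket, Key=dst_key, Body=out.getvalue().encode("utf-8"))


if __name__ == "__main__":
    run_etl("example-bucket", "raw/data.csv", "clean/data.csv")  # hypothetical names
```

In a real pipeline the transform would carry the business logic; keeping it a pure function makes it unit-testable without AWS credentials.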
Our client requires the services of a Data Scientist. POSITION: Contract role for 3 months. REQUIREMENTS: 4–6 years' experience; cost-variance use-case build; Data Science.
Knowledgeable in basic Project Management protocols. Data analytics or data modelling experience will be highly beneficial. Technical support for cloud and mobile devices. Software support and training. Travel if required. Market the products and services and add value. Data-related functions using Excel, Microsoft SQL or similar. Assist with data analyses of financial data. Assist clients with Business Requirements and deliver solutions to solve their challenges.
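"Data-related functions using Excel, Microsoft SQL or similar" usually boils down to aggregation queries over financial data. A small sketch using SQLite as a stand-in engine (the table and figures are illustrative only):

```python
import sqlite3

# In-memory database standing in for "Microsoft SQL or similar".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?)",
    [("A", 100.0), ("A", -40.0), ("B", 250.0)],  # sample financial data
)

# A typical data-related function: net total per account.
totals = dict(
    conn.execute(
        "SELECT account, SUM(amount) FROM transactions GROUP BY account"
    ).fetchall()
)
print(totals)  # {'A': 60.0, 'B': 250.0}
```

The same `GROUP BY`/`SUM` pattern carries over directly to Microsoft SQL Server or an Excel pivot table.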
Support test case creation. Coordinate test data creation with the developers and test analysts. Assess test feasibility. Technical test case creation. Clear defect capturing. Defect workflow adherence. Managing and communicating the identification, creation and sanitisation of test data. Manual and automated test execution. Test maintenance.
Writing test cases, test execution, and defect capture. Planning and effort estimation for test case execution and completion from a testing perspective. Coordinate test data creation with the developers. Track new/changed requirements.
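Test data creation and sanitisation, as listed above, often means masking production records so they can be reused safely. A minimal sketch, assuming hypothetical field names and a hash-based masking rule:

```python
import hashlib


def sanitise(record):
    """Mask personally identifiable fields in one record (assumed masking rule)."""
    masked = dict(record)
    # Replace the ID number with a stable, irreversible short hash.
    masked["id_number"] = hashlib.sha256(record["id_number"].encode()).hexdigest()[:12]
    masked["name"] = "TEST_USER"
    return masked  # non-sensitive fields (e.g. balance) pass through unchanged


def make_test_data(records):
    """Sanitise a batch of records for use as test data."""
    return [sanitise(r) for r in records]


sample = [{"id_number": "8001015009087", "name": "J. Smith", "balance": 120.5}]
print(make_test_data(sample))
```

Hashing rather than deleting the identifier keeps joins between test tables intact while removing the real value.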
Ability to develop in data-driven programming languages such as Python and to build Big Data pipelines such as ETL. Meet with end users and gather requirements. User training. System testing/parallel runs. System implementation. Agile Working Model (AWM) Charter. ADVANTAGEOUS SKILLS: Data and API mining. Knowledge of security best practices and setting up alerting pipelines. Comfortable with data structures and algorithms. Understanding of integration. Docker container creation and usage. Familiar with data streaming services such as Apache Kafka. Coordination.
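The Kafka and alerting-pipeline skills above can be sketched together: a consumer reads events and a handler flags alerts. This sketch assumes the `kafka-python` client, a hypothetical topic name, and an invented threshold rule; the handler is kept pure so it can be tested without a broker:

```python
import json


def handle_event(raw_bytes):
    """Pure message handler: parse a JSON event and flag large readings (threshold assumed)."""
    event = json.loads(raw_bytes)
    event["alert"] = event.get("value", 0) > 100
    return event


def consume(topic="sensor-events"):  # hypothetical topic name
    # kafka-python is one common client; imported here so handle_event stays dependency-free.
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(topic, bootstrap_servers="localhost:9092")
    for msg in consumer:
        print(handle_event(msg.value))


if __name__ == "__main__":
    consume()
```

In production the `print` would feed an alerting sink (e.g. another topic or a pager integration) rather than stdout.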
Work in cloud environments to build a database access monitoring solution for PostgreSQL databases. In-depth knowledge and presentation skills. Knowledge of data modelling and data visualisation tools. Cloud experience. Ability and willingness to coach and give training to fellow colleagues and users when required.
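A database access monitoring solution for PostgreSQL can start from the built-in `pg_stat_activity` view, which lists current sessions. A minimal sketch, assuming the `psycopg2` driver and a hypothetical DSN; the summariser is pure so it can be tested without a live database:

```python
def summarise_sessions(rows):
    """Count active sessions per user from (usename, state) tuples."""
    counts = {}
    for user, state in rows:
        if state == "active":
            counts[user] = counts.get(user, 0) + 1
    return counts


def monitor(dsn="dbname=app user=monitor"):  # hypothetical connection string
    # psycopg2 is one common driver; pg_stat_activity is PostgreSQL's session view.
    import psycopg2

    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT usename, state FROM pg_stat_activity")
            return summarise_sessions(cur.fetchall())


if __name__ == "__main__":
    print(monitor())
```

Polling this on a schedule and shipping the counts to a dashboard or alert rule gives a basic access-monitoring loop.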
Provide input in terms of benefits and risks. Preparing test data for testing of user stories. Executing and/or supporting training for business process owners. Preparing cut-over strategy, e.g., data migration and Go-Live. Ability and willingness to coach and give training to fellow colleagues and users when required.
Documentation. Enable end-user knowledge transfer and training. Work closely with other functional consultants. IT Service Management (ITSM). OPS Advanced Training. OPS Basic Training. Problem Management (PM). Release Planning. Informatics understanding to resolve problems related to data flow between various systems and middleware. Ability and willingness to coach and give training to fellow colleagues and users when required.