Our client requires the services of a Data Engineer/Scientist (Expert) – Midrand/Menlyn/Rosslyn/Home Office. COMMENCEMENT: As soon as possible. ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with … SKILLS: Boto3; ETL; Docker; Linux/Unix; Big Data; PowerShell/Bash; GROUP Cloud Data Hub (CDH); GROUP CDEC Blueprint. Knowledge of data formats such as Parquet, AVRO, JSON, XML and CSV. Experience working with data quality tools.
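As a small illustration of the data-format fluency the role asks for, the sketch below converts a JSON array of flat records into CSV text using only the Python standard library. The field names and sample values are invented for the example; a real pipeline would read its records from the platform.

```python
import csv
import io
import json

# Hypothetical sample records; real data would come from the GROUP platform.
records_json = '[{"site": "A", "tonnes": 120}, {"site": "B", "tonnes": 95}]'

def json_to_csv(payload: str) -> str:
    """Convert a JSON array of flat objects into CSV text."""
    rows = json.loads(payload)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

csv_text = json_to_csv(records_json)
```

The same shape of helper extends naturally to the other listed formats (Parquet and AVRO would need third-party libraries such as pyarrow or fastavro).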
Our client requires the services of a Data Scientist/Engineer (Entry) – Midrand/Menlyn/Rosslyn/Home Office
COMMENCEMENT: As soon as possible. ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with … SKILLS: Boto3; ETL; Docker; Linux/Unix; Big Data; PowerShell/Bash; GROUP Cloud Data Hub (CDH); GROUP CDEC Blueprint. Business Intelligence (BI) experience. Technical data modelling and schema design ("not drag and drop").
Our client requires the services of a Data Scientist. POSITION: Contract role for 3 months. REQUIREMENTS: 4–6 years' experience; cost-variance use-case build; Data Science.
A mining client is currently seeking the services of a Data/Technical Analyst to join their team for a project. KEY PERFORMANCE AREAS: · View, analyse and compare data from the mining site to provide valuable insights that drive business decisions. · Translate data into reports and dashboards and communicate them to stakeholders. · Work with data sets from different data sources to collate information for stakeholders. REQUIREMENTS: Data extraction, capture and analysis skills. Excellent presentation, communication and reporting skills. Ability …
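The analyse-and-summarise duty above can be sketched with the standard library alone: group readings from a site by some key and report a per-group average for stakeholders. The site names and the `output_tonnes` column are hypothetical, chosen only to make the example concrete.

```python
import statistics
from collections import defaultdict

# Hypothetical readings; a real project would collate these from site systems.
readings = [
    {"site": "North Shaft", "output_tonnes": 120},
    {"site": "North Shaft", "output_tonnes": 140},
    {"site": "South Shaft", "output_tonnes": 95},
]

def summarise_by_site(rows):
    """Group readings by site and report the mean output per site."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["site"]].append(row["output_tonnes"])
    return {site: statistics.mean(vals) for site, vals in grouped.items()}

summary = summarise_by_site(readings)
```

In practice the same aggregation would feed a dashboard or report rather than a plain dictionary.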
Writing test cases, test execution, and defect capture. Planning and effort estimation for test case execution. Reporting back to business upon request, including reporting on the types of defects, their cause, and their severity. Confirming completion from a testing perspective. Coordinate test data creation with the developers. Track new/changed requirements.
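The test-case writing and execution duties above can be sketched with Python's built-in unittest framework. The function under test (`apply_discount`) is invented for the example; each test method stands in for one written test case, and a failing run is what would be captured as a defect with its cause and severity.

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical function under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    # Each method is one written test case.
    def test_normal_discount(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(200.0, 150)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestApplyDiscount)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```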
Ability to develop in data-driven programming languages such as Python, and to build Big Data pipelines such as ETL pipelines. Knowledge of multiple database types, such as structured and unstructured databases. Linux. Working Model (AWM) Charter. ADVANTAGEOUS SKILLS: Data and API mining. Knowledge of security best practices and of setting up alerting pipelines. Comfort with data structures and algorithms. Understanding of integration. Docker container creation and usage. Familiarity with data-streaming services such as Apache Kafka. Coordination and support of test case creation. Coordination of test data creation with the developers and test analysts. Feasibility. Technical test case creation. Clear defect capturing. Defect workflow adherence. Managing and communicating … Identification, creation and sanitisation of test data. Manual and automated test execution. Maintenance …
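The ETL pipeline development mentioned above can be sketched as three small steps: extract raw rows, transform them into typed records, and load them into a database. Everything here is hypothetical (table name, columns, sample rows); a real extract step might pull objects from S3 via Boto3, and the load target would be a platform warehouse rather than in-memory SQLite.

```python
import sqlite3

def extract():
    # Hypothetical extract: a real pipeline might read from S3 via Boto3.
    return [("2024-01-01", "plant_a", "120"), ("2024-01-02", "plant_a", "95")]

def transform(rows):
    # Cast the raw string measurements to integers.
    return [(day, plant, int(tonnes)) for day, plant, tonnes in rows]

def load(rows, conn):
    # Load the typed rows into a (hypothetical) output table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS output (day TEXT, plant TEXT, tonnes INTEGER)"
    )
    conn.executemany("INSERT INTO output VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(tonnes) FROM output").fetchone()[0]
```

Keeping the three stages as separate functions is what makes such a pipeline testable stage by stage.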
tickets. The solution time is equal for both ticket types. There are no exceptions concerning public holidays. … environments to build a database access monitoring solution for PostgreSQL databases. In-depth knowledge … and presentation skills. Knowledge of data modelling and data visualisation tools. Cloud experience:
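A database access monitoring solution of the kind described would typically query PostgreSQL's `pg_stat_activity` view for open sessions. Since no live database is assumed here, the sketch below shows the monitoring query as a string and runs the tallying logic against simulated result rows; the user names and addresses are invented.

```python
from collections import Counter

# Query a real solution might run against PostgreSQL (not executed here).
MONITOR_QUERY = "SELECT usename, client_addr FROM pg_stat_activity;"

# Simulated result rows: (database user, client address).
sample_rows = [
    ("app_user", "10.0.0.5"),
    ("app_user", "10.0.0.6"),
    ("analyst", "10.0.0.9"),
]

def connections_per_user(rows):
    """Tally open connections per database user - the core of a simple
    access-monitoring report."""
    return Counter(user for user, _addr in rows)

report = connections_per_user(sample_rows)
```

In production, `sample_rows` would come from a driver such as psycopg2 executing `MONITOR_QUERY` on a schedule, with the tallies feeding an alerting pipeline.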
experience. COMMENCEMENT: As soon as possible. Testing of Type Approval Management features via automation. Collaborate … Management, maintenance and preparation of test data. Interpretation of testing results and logging of … Frontend, backend and integration testing. Test data management. Manual, performance, security and load testing. Identification, creation and sanitisation of test data. Security and reliability testing. Technical test …