Our client requires the services of a Data Scientist/Engineer (Entry) – Midrand/Menlyn/Rosslyn/Home Office (on-site when required). Willing and able to travel internationally when needed.
ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with policy. SKILLS: Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint, Business Intelligence (BI) experience, technical data modelling and schema design ("not drag and drop").
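The Boto3/ETL requirement above can be sketched as a small extract–transform–load step. This is a hedged illustration, assuming CSV objects in S3; the bucket, key, and field names are hypothetical, and the boto3 import is deferred so the transform step can be exercised offline.

```python
import csv
import io


def transform_rows(raw_csv: str) -> list[dict]:
    """Transform step: parse raw CSV text and normalise field names/types.

    The "ID"/"Amount" column names are illustrative assumptions.
    """
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [
        {"id": row["ID"].strip(), "amount": float(row["Amount"])}
        for row in reader
    ]


def load_to_s3(records: list[dict], bucket: str, key: str) -> None:
    """Load step: write transformed records back to S3 as CSV.

    Requires AWS credentials at runtime; bucket/key are placeholders.
    """
    import boto3  # deferred so the transform step stays testable offline

    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["id", "amount"])
    writer.writeheader()
    writer.writerows(records)
    boto3.client("s3").put_object(Bucket=bucket, Key=key, Body=out.getvalue())
```

In a real pipeline the extract step would read the source object with `get_object` and feed its body into `transform_rows`.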
Our client requires the services of a Data Scientist. POSITION: Contract role for 3 months. REQUIREMENTS: 4–6 years' experience; cost-variance use case build; Data Science.
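The cost-variance use case could start from a calculation like this minimal sketch; the sign convention (positive = under budget) and the returned field names are assumptions, not the client's specification.

```python
def cost_variance(budget: float, actual: float) -> dict:
    """Standard cost-variance figures: absolute variance and % of budget.

    Positive variance means under budget here (an assumption; some teams
    flip the sign so that positive means overspend).
    """
    variance = float(budget) - float(actual)
    pct = (variance / budget * 100) if budget else 0.0
    return {"variance": round(variance, 2), "variance_pct": round(pct, 2)}
```

Usage: `cost_variance(1000, 850)` → `{'variance': 150.0, 'variance_pct': 15.0}`.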
Responsibilities: Leverage Azure Data Factory to orchestrate and automate data workflows, ensuring seamless data movement. Perform complex data analytics, employing both Azure and Python notebooks for scalable data processing. Power BI: transform raw data into compelling visual stories with Power BI, providing actionable insights. Python programmer: use Python within Azure to manage data, automate tasks, and build machine learning models. Apply rigorous security measures to protect data within Azure, using tools and practices that prevent unauthorised access.
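As an illustration of the notebook-to-Power-BI flow described above, here is a hedged sketch of shaping raw rows into the kind of aggregate table a Power BI visual would consume; the column names are invented for the example.

```python
from collections import defaultdict


def summarise_for_report(rows: list[dict]) -> list[dict]:
    """Aggregate raw transaction rows into per-category totals, sorted
    largest-first — the shaped table a report visual typically consumes.

    The "category"/"amount" field names are illustrative assumptions.
    """
    totals: dict[str, float] = defaultdict(float)
    for row in rows:
        totals[row["category"]] += row["amount"]
    return sorted(
        ({"category": c, "total": t} for c, t in totals.items()),
        key=lambda r: r["total"],
        reverse=True,
    )
```

In an Azure notebook the same shaping would usually be done with pandas before the result is published to a dataset.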
Support test case creation. Coordinate test data creation with the developers and test analysts. Feasibility assessment. Technical test case creation. Clear defect capturing. Defect workflow adherence. Managing and communicating identification, creation and sanitisation of test data. Manual and automated test execution. Maintenance of test data.
Writing test cases, test execution, and defect capture. Planning and effort estimation for test case execution completion from a testing perspective. Coordinate test data creation with the developers. Track new/changed requirements.
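The manual-to-automated test execution described above might look like this in Python's unittest framework: a sketch, with a hypothetical test-data factory standing in for the coordinated test data creation.

```python
import unittest


def create_test_data(n: int) -> list[dict]:
    """Illustrative test-data factory: deterministic customer records.

    Field names are invented for the example.
    """
    return [
        {"id": i, "name": f"customer-{i}", "active": i % 2 == 0}
        for i in range(n)
    ]


class TestCustomerData(unittest.TestCase):
    """A written test case made executable: data creation plus
    defect-catching checks that would otherwise be run manually."""

    def test_record_count(self):
        self.assertEqual(len(create_test_data(5)), 5)

    def test_ids_are_unique(self):
        ids = [r["id"] for r in create_test_data(100)]
        self.assertEqual(len(ids), len(set(ids)))
```

Run with `python -m unittest` to execute the suite and capture any defects as failures.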
Maintenance of artifacts and activities. Maintenance of customer base data. Engage with stakeholders of TLM Crisis Management. Practical experience of IT infrastructure, i.e. data centres, networks, servers, storage, platform, middleware. IT process governance. Data analysis – the ability to analyse and visualise data sets (Excel, Power BI). Willing to travel extensively, for up to 2 weeks at a time (international). Excellent interpersonal and organizational skills.
Business intelligence. Design and map data models to shift raw data into meaningful insights. Utilize Power BI with apt objectives. Analyse previous and present data for better decision-making. Transform business requirements into technical publications. Build multi-dimensional data models. Develop strong data documentation about algorithms and parameters. Power BI. Define and design new systems. Take care of data warehouse development. Make essential technical and strategic decisions.
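A multi-dimensional data model of the kind mentioned above can be sketched as a tiny star schema in plain Python: one fact table rolled up along an attribute of a dimension table. The tables and figures are made up for illustration.

```python
# Dimension table: product attributes keyed by surrogate id (illustrative).
dim_product = {
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Licence", "category": "Software"},
}

# Fact table: one row per sale, holding a foreign key and a measure.
fact_sales = [
    {"product_id": 1, "amount": 120.0},
    {"product_id": 2, "amount": 300.0},
    {"product_id": 1, "amount": 80.0},
]


def sales_by_category(facts: list[dict], products: dict) -> dict:
    """Roll the fact table up along the product dimension's category
    attribute — the core operation behind a dimensional report."""
    totals: dict[str, float] = {}
    for f in facts:
        cat = products[f["product_id"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + f["amount"]
    return totals
```

In a warehouse the same roll-up would be a fact-to-dimension join with a GROUP BY on the category column.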
Management, maintenance and preparation of test data. Interpretation of testing results and logging of defects. Perform continuous quality analysis and provide internal tools to help ensure the quality of the products. Frontend, backend and integration testing. Test data management. Manual, performance, security and load testing. Identification, creation and sanitisation of test data. Security and reliability testing. Technical test case creation.
Prepare testing documents and perform internal testing. Preparation of master data templates for various objects and functional specifications for them. Preparing test data for testing of CRs (Change Requests). Testing CRs. SKILLS: SAP ABAP development on ECC and S/4HANA, Data Services, BAPIs, Eclipse IDE, SAP Web IDE, SAP UI5 (simple).