Our client requires the services of a Data Engineer/Scientist (Expert) – Midrand/Menlyn/Rosslyn/Home Office.
ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the relevant guidelines.
Skills: Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint. Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV. Experience working with Data Quality Tools.
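The posting lists knowledge of data formats such as Parquet, AVRO, JSON and CSV. As an illustrative sketch only (not part of the posting, and the function name is hypothetical), a minimal CSV-to-JSON conversion of the kind a small ETL step might perform:

```python
import csv
import io
import json

def csv_to_json_records(csv_text: str) -> str:
    """Parse CSV text and emit a JSON array of row objects,
    a typical small format-conversion step in an ETL pipeline."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)

raw = "id,name\n1,widget\n2,gadget\n"
print(csv_to_json_records(raw))
# [{"id": "1", "name": "widget"}, {"id": "2", "name": "gadget"}]
```

In a real pipeline the input would come from object storage (e.g., via Boto3) and the output would land in a columnar format such as Parquet; this sketch only shows the shape of the transformation.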
Our client requires the services of a Data Scientist/Engineer (Entry) – Midrand/Menlyn/Rosslyn/Home Office
Architecture guidelines, including quality assurance. Monitoring and reporting of TLM artifacts and activities in the Group. Maintenance of customer base data. Engage with stakeholders of TLM. Crisis Management. Practical experience of IT Infrastructure, i.e., Data Centres, Networks, Servers, Storage, Platform, Middleware. IT Process Governance. Data Analysis – the ability to analyse and visualise data sets (Excel, Power BI).
Integration security tests. System monitoring, including IDoc monitoring and processing. Following the Agile methodology. Preparing Functional Specifications and test data for testing of CRs (Change Requests). Testing CRs.
Developer and customer experience. Develop scripts/monitoring to analyse and visualise important metrics about development processes. Develop scripts/monitoring to analyse and visualise important metrics about Access Management. Service Management: incident monitoring, managing SLAs, problem management reporting, ITIL experience. Architecture: Cloud, on-prem, hybrid, data modelling, SW architecture.
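The duties above include developing scripts to analyse service-management metrics such as SLAs. A minimal sketch of such a script, assuming hypothetical incident records and an assumed four-hour SLA threshold (none of these values are from the posting):

```python
from datetime import timedelta

# Hypothetical incident records: (incident_id, time_to_resolve).
incidents = [
    ("INC-1", timedelta(hours=2)),
    ("INC-2", timedelta(hours=9)),
    ("INC-3", timedelta(hours=1)),
]

SLA = timedelta(hours=4)  # assumed SLA threshold for the sketch

def sla_breach_rate(records, sla):
    """Fraction of incidents whose resolution time exceeded the SLA."""
    breaches = sum(1 for _, ttr in records if ttr > sla)
    return breaches / len(records)

print(f"SLA breach rate: {sla_breach_rate(incidents, SLA):.0%}")
```

In practice the records would be pulled from an ITSM tool's API and the result fed into a dashboard; the sketch shows only the metric computation.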
Business intelligence. Design and map data models to turn raw data into meaningful insights. Utilize Power BI with apt objectives. Analyse previous and present data for better decision-making. Transform business requirements into publications. Build multi-dimensional data models. Develop strong data documentation about algorithms and parameters. Define and design new systems in Power BI. Take care of data warehouse development. Make essential technical and strategic decisions.
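"Build multi-dimensional data models" typically means star-schema structures: a fact table keyed to dimension tables. As a language-neutral illustration (the table contents and function name are hypothetical, not from the posting), the pattern can be sketched in plain Python:

```python
# Star-schema sketch: a sales fact table joined to a product dimension
# to produce an aggregated insight (revenue per category).
dim_product = {
    1: {"name": "widget", "category": "hardware"},
    2: {"name": "gadget", "category": "hardware"},
}
fact_sales = [
    {"product_id": 1, "amount": 100.0},
    {"product_id": 2, "amount": 50.0},
    {"product_id": 1, "amount": 25.0},
]

def revenue_by_category(facts, dim):
    """Aggregate fact rows by a dimension attribute (category)."""
    totals = {}
    for row in facts:
        cat = dim[row["product_id"]]["category"]
        totals[cat] = totals.get(cat, 0.0) + row["amount"]
    return totals

print(revenue_by_category(fact_sales, dim_product))
# {'hardware': 175.0}
```

In Power BI the same model would be expressed as related tables with a DAX measure for the aggregation; the Python version only conveys the fact/dimension relationship.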
Excellence, including automation and proactive monitoring approaches. Collaborate with an international team. Hardening (e.g., Ivanti). Worked with Application Monitoring tools. Important: a clear criminal record is required. Automated problem remediation. Machine learning (ML) for data analytics. Network troubleshooting skills. Experience with applications and connectivity. Experience with monitoring systems. Hybrid network design and implementation.
Deployment. Understanding GROUP's CA Data Management / "Follow the Data" as a VDI deployment approach. Driving "state of the art" architectural design. Perform housekeeping/monitoring/managing of the VDI platforms, focusing on stability/performance. Certifications: any operating system certification, any certification relating to data management, any programming certification, any web certification.
Parts availability for G01 and G45 series build. Monitor vessel status to identify delays so that action can be taken (A/F of parts from Coastal Areas into plant). Monitor train status to ensure containers are received. Ensure the correct transmission of weekly call-offs. Monitoring and reporting of correct stock levels in plant. Track performance and initiate overall improvements. Monitoring of open Advanced Shipping Notifications (ASNs).
DevOps, with a strong focus on infrastructure, monitoring, debugging and fault-finding. Handling of incidents. Azure networking, SQL, Azure Functions, Azure Monitor. Kubernetes and Docker. ITIL processes, in particular the interface between development and support environments. Monitoring. Assist with identification and management of issues.
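The DevOps role centres on monitoring and fault-finding. As a minimal sketch of the classification step in a health-check monitor (service names and status codes are invented for illustration; a real setup would query Azure Monitor or a Kubernetes liveness endpoint):

```python
def classify(status_code: int) -> str:
    """Map an HTTP status code to a coarse health state."""
    if 200 <= status_code < 300:
        return "healthy"
    if status_code in (502, 503, 504):
        return "degraded"  # typical gateway/overload responses
    return "faulty"

# Hypothetical snapshot of per-service status codes.
SERVICES = {"api": 200, "worker": 503, "db-proxy": 500}

for name, code in SERVICES.items():
    print(f"{name}: {classify(code)}")
```

Feeding such classifications into an ITIL incident process (degraded → warning, faulty → incident ticket) is the usual next step; the sketch stops at the classification itself.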