Data Engineers are responsible for building and maintaining Big Data pipelines using data platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the relevant information policies. Skills: PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash. Salary: Market related.
South African citizens may apply. Drive Awareness: Evaluate real-time data, processes, and user interactions. Capture User Requirements: Run workshops to capture user requirements and stories for project builds and sprints.
South African ID holders may apply. Requires 5 years of experience as an IT Data Governance Lead with application experience (SAP). The Data Governance Lead is responsible for ensuring the effective working of the data governance function: engaging stakeholders to gather their data governance requirements and to understand their data, their (meta)data needs, and their business context; working out the enterprise data governance strategy and roadmap, aligned with the Chief Data Officer; and developing and implementing the data governance framework, policies, data governance standards, and data protection standards.
Ensure comprehensive engagement and accurately capture essential functional requirements. Model business processes, business rules, and the flow of data and messages to create detailed system representations.
Understanding CA Data Management / "Follow the Data" as a VDI deployment approach. Driving possible VDI automation; consulting on and designing the optimal VDI solution. Driving the CA VDI cloud strategy towards a constant state of improvement. Certifications: any operating system certification relating to data management; any programming certification; any web certification.
Ability to develop in data-driven programming languages such as Python, and to build Big Data pipelines such as ETL workflows.
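To illustrate the kind of work the requirement above describes, here is a minimal ETL sketch in plain Python. It is an assumption-laden toy, not any employer's actual pipeline: the field names, sample records, and in-memory "sink" are all hypothetical, and a real pipeline would typically use PySpark or a similar framework instead of plain lists.

```python
# Minimal extract-transform-load (ETL) sketch in plain Python.
# All field names and sample records are hypothetical.

def extract():
    """Simulate reading raw records from a source system."""
    return [
        {"id": 1, "amount": "100.50", "region": " za "},
        {"id": 2, "amount": "75.00", "region": "ZA"},
        {"id": 3, "amount": None, "region": "za"},  # bad record
    ]

def transform(records):
    """Clean and normalise: drop bad rows, cast types, trim text."""
    cleaned = []
    for row in records:
        if row["amount"] is None:
            continue  # skip records with missing amounts
        cleaned.append({
            "id": row["id"],
            "amount": float(row["amount"]),
            "region": row["region"].strip().upper(),
        })
    return cleaned

def load(records, target):
    """Append cleaned records to an in-memory 'sink'."""
    target.extend(records)
    return len(records)

sink = []
loaded = load(transform(extract()), sink)
print(loaded)  # 2 records survive cleaning
```

The same extract/transform/load split carries over directly to PySpark, where `transform` would become DataFrame operations (filters, casts) instead of a Python loop.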
Specialist, on a hybrid basis. Must be willing to drive 750 km to different sites per month on average. The role involves working with other enterprise systems to facilitate seamless data exchange and interoperability, as well as performance monitoring.