COMMENCEMENT: As soon as possible
ROLE: Responsible for CA Data Management of the central sector and all plants and guidelines
Consult users on how to store CA data and permanent data archives, as well as how to run reports
Assign (Windows only)
Create CA data structure as per GROUP guidelines
Rearrange CA data structure as per project
Clean up obsolete CA data using the web application CA Cleanup Tool (Unix only)
Backup and restore CA data
Allow users to work on weekends (changes)
Serve as contact for data management related questions in the cloud
COMMENCEMENT: As soon as possible
ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms (a minimal sketch of such a pipeline step follows below). Data Engineers are custodians of data and must ensure that data is shared in line with GROUP guidelines. Mentor junior engineers. Demonstrate solution architect expertise. Be able to execute any Department-related tasks.
SKILLS: AWS Certified Developer Associate, AWS Certified Architect Associate, AWS Certified Architect Professional, Hashicorp Certified, Boto3, ETL, Docker, Linux / Unix, Big Data, Powershell / Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint
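The role above centres on Python, Boto3 and ETL against S3-backed platforms such as the GROUP Cloud Data Hub. The following is a minimal, illustrative sketch of one extract-transform-load step under those assumptions; the bucket names, keys and cleaning logic are hypothetical and not taken from the listing.

```python
"""Minimal ETL sketch: extract a CSV from S3, transform it, load it back.

Illustrative only -- bucket names, keys and columns are placeholders, not
anything specified in the job listing above.
"""
import io

import boto3
import pandas as pd

SOURCE_BUCKET = "raw-zone-example"      # hypothetical landing bucket
TARGET_BUCKET = "curated-zone-example"  # hypothetical curated bucket
SOURCE_KEY = "sales/2024/01/sales.csv"
TARGET_KEY = "sales/2024/01/sales_clean.csv"


def run_etl() -> None:
    s3 = boto3.client("s3")

    # Extract: download the raw object and parse it as CSV.
    raw = s3.get_object(Bucket=SOURCE_BUCKET, Key=SOURCE_KEY)
    df = pd.read_csv(io.BytesIO(raw["Body"].read()))

    # Transform: drop duplicates and normalise column names.
    df = df.drop_duplicates()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

    # Load: write the cleaned frame back to the curated bucket.
    out = io.StringIO()
    df.to_csv(out, index=False)
    s3.put_object(Bucket=TARGET_BUCKET, Key=TARGET_KEY, Body=out.getvalue())


if __name__ == "__main__":
    run_etl()
```

In a real CDH pipeline a step like this would typically be containerised (Docker) and scheduled by the platform rather than run by hand.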
Our client requires the services of a Data Scientist/Engineer (Entry) – Midrand/Menlyn/Rosslyn/Home Office. AWS Glue. Operating System: Windows, Linux and Unix. BI/DWH/ETL Tools: Informatica 9.5/9.1/8.6, Tableau
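Since the Entry role lists AWS Glue alongside the BI/DWH/ETL tooling, here is a bare-bones Glue PySpark job skeleton for orientation. It assumes the standard Glue runtime; the catalog database, table, key column and S3 path are placeholder names, not details from the listing.

```python
# Skeleton of an AWS Glue PySpark job script (runs inside the Glue runtime,
# not locally). Database, table and path names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table previously crawled into the Glue Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="example_table"
)

# Trivial transform: drop rows missing the (hypothetical) "id" key.
cleaned = source.toDF().dropna(subset=["id"])

# Write the result out as Parquet to a curated location.
cleaned.write.mode("overwrite").parquet("s3://example-curated/example_table/")

job.commit()
```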
Managing applications on Kubernetes clusters (see the sketch after this list)
Data modelling and Database technologies (relational and non-relational)
Process knowledge and work experience
Understanding of BI Tools will be an advantage
Assisting with the business
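The Kubernetes item above can also be approached programmatically. Below is a small sketch using the official Kubernetes Python client, kept in Python to match the other examples; the namespace, Deployment name and replica count are placeholders, and in practice such changes are more often driven by manifests or Helm charts.

```python
# Sketch: inspecting and scaling applications on a Kubernetes cluster with
# the official Python client. Names and counts are hypothetical.
from kubernetes import client, config


def list_and_scale(namespace: str = "default") -> None:
    # Load credentials from the local kubeconfig (e.g. ~/.kube/config).
    config.load_kube_config()
    apps = client.AppsV1Api()

    # Inspect the Deployments currently running in the namespace.
    for dep in apps.list_namespaced_deployment(namespace).items:
        print(dep.metadata.name, dep.spec.replicas)

    # Example management action: scale a hypothetical Deployment to 3 replicas.
    apps.patch_namespaced_deployment_scale(
        name="example-api",
        namespace=namespace,
        body={"spec": {"replicas": 3}},
    )


if __name__ == "__main__":
    list_and_scale()
```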
Our client requires the services of a Data Engineer/Scientist (Senior) - Midrand/Menlyn/Rosslyn/Home Office
COMMENCEMENT: As soon as possible
ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with GROUP guidelines.
SKILLS: AWS Certified Developer Associate, AWS Certified Architect Associate, AWS Certified Architect Professional, Hashicorp Certified, Boto3, ETL, Docker, Linux / Unix, Big Data, Powershell / Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint
Ability to develop in data-driven programming languages such as Python and Big Data pipelines such as ETL
AWS Certified Developer Associate / Solutions Architect (advantageous)
ESSENTIAL SKILLS: Expertise in the Agile Working Model (AWM) Charter
ADVANTAGEOUS SKILLS: Data and API Mining
Knowledge of security best practices and setting up alerting pipelines (see the sketch after this list)
Be comfortable with Data Structures and Algorithms
Understanding of integration technologies
Able to refine and plan stories and EPICs
Architecting solutions to business problems
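The "alerting pipelines" item is concrete enough to sketch. Assuming an AWS stack (Boto3 appears elsewhere in this section), the snippet below wires a CloudWatch alarm to an SNS topic for a hypothetical pipeline failure metric; all metric, namespace and topic names are illustrative.

```python
# Sketch of an alerting pipeline on AWS: a CloudWatch alarm notifies an SNS
# topic when a hypothetical ETL failure metric breaches its threshold.
import boto3

cloudwatch = boto3.client("cloudwatch")
sns = boto3.client("sns")

# Create (or reuse) the topic that on-call engineers subscribe to.
topic_arn = sns.create_topic(Name="etl-pipeline-alerts")["TopicArn"]

# Alarm whenever more than 5 failed records are reported in a 5-minute window.
cloudwatch.put_metric_alarm(
    AlarmName="etl-failed-records",
    Namespace="ExampleETL",
    MetricName="FailedRecords",
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=5,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=[topic_arn],
)

# The pipeline itself reports the metric after each batch run.
cloudwatch.put_metric_data(
    Namespace="ExampleETL",
    MetricData=[{"MetricName": "FailedRecords", "Value": 2, "Unit": "Count"}],
)
```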
targets and tracking and analysis. Collaborate with the BI data team to ensure that dashboards are properly rolled out in Active Collections and that data and reporting are relevant and fit for purpose. Contribute to collections and recovery results by integrating models and data-driven approaches into operational processes.
Dialog Programming
SAPscript and Smartforms
Batch Data Capture (BDC)
Function Modules and BAPIs (see the RFC sketch after this list)
Enhancements
Collaboration: Work closely with other developers, architects, and stakeholders to ensure seamless integration
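This is an ABAP-side skill set, but BAPIs and function modules are also what external systems call over RFC. As a hedged, outside-in illustration only, the sketch below calls a standard BAPI from Python using the pyrfc package (which requires the SAP NetWeaver RFC SDK); the connection parameters are placeholders, and the ABAP work itself (Dialog Programming, Smartforms, Enhancements) happens inside the SAP system.

```python
# Illustrative only: calling a standard, read-only BAPI over RFC with pyrfc.
# Host, system number, client and credentials are placeholders.
from pyrfc import Connection

conn = Connection(
    ashost="sap-host.example",  # hypothetical application server
    sysnr="00",
    client="100",
    user="DEVUSER",
    passwd="secret",
)

# BAPI_USER_GET_DETAIL returns address and logon data for a user.
result = conn.call("BAPI_USER_GET_DETAIL", USERNAME="DEVUSER")
print(result["ADDRESS"])

conn.close()
```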
Financial Reporting
Software Packages (SAP, SAP-BI, etc.)
Business Acumen
The ability to deal with ambiguity
Java
Knowledge of design patterns
Knowledge of architecting and developing solutions for scalable, distributed systems