An Expert AWS Data Engineer in South Africa. Design, implement, and optimize Big Data Pipelines using AWS services. Ensure data integrity, security, and compliance while collaborating with teams, applying expertise in AWS technologies and data engineering to drive impactful data solutions. Key Responsibilities: - Design, implement, and optimize Big Data Pipelines using AWS services. - Ensure data integrity, security, and compliance documentation. - Conduct thorough testing and validation of data solutions. - Stay updated with industry trends and
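Ensuring data integrity is typically enforced as a validation stage inside the pipeline itself. A minimal sketch of such a stage, in plain Python as it might run in a Glue or Lambda step: the record shape and field names (`order_id`, `amount`) are illustrative assumptions, not taken from any specific system.

```python
# Hypothetical data-integrity check for a pipeline stage.
# Field names (order_id, amount) are illustrative assumptions.

def validate_records(records):
    """Split records into valid rows and rejected rows with reasons."""
    valid, rejected = [], []
    for rec in records:
        if not rec.get("order_id"):
            rejected.append((rec, "missing order_id"))
        elif not isinstance(rec.get("amount"), (int, float)) or rec["amount"] < 0:
            rejected.append((rec, "invalid amount"))
        else:
            valid.append(rec)
    return valid, rejected
```

Rejected rows would normally be written to a quarantine location (e.g. a dead-letter S3 prefix) rather than dropped, so compliance reporting can account for every input record.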
Debug and upgrade software; create security and data protection settings; build features and applications with responsive design; write technical documentation; work with data scientists and analysts to improve software. Front-end:
security infrastructure, ensuring the protection of data, resources, and systems. Apply Information Security standards, such as ISO 27000-27004, to uphold system security and data protection. Coordinating with all departments; TCP/IP/Routers/Switches/Firewalls; Conducting ICT (Data) Backup and Restore; Server operating systems and
and development (Containerization: Docker/Podman). Data layer (JPA, Hibernate, Domain Object Model, XML/XSD pattern, MVC, etc.). Spring Framework (MVC, Batch, Web, Data, Security). SonarQube. Build tools (Apache Ant,
TCP/IP/Routers/Switches/Firewalls; Conducting ICT (Data) Backup and Restore; Server operating systems and
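The backup-and-restore duty above can be sketched as a small Python utility using the standard library's `tarfile`; the directory and archive paths are illustrative assumptions, and a production script would add scheduling, rotation, and off-site copies.

```python
# Minimal backup/restore sketch using only the standard library.
# Paths are illustrative; real backups need rotation and off-site copies.
import tarfile
from pathlib import Path


def backup(source_dir, archive_path):
    """Archive source_dir into a gzip-compressed tarball."""
    with tarfile.open(archive_path, "w:gz") as tar:
        tar.add(source_dir, arcname=Path(source_dir).name)


def restore(archive_path, target_dir):
    """Extract a previously created backup archive into target_dir."""
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(target_dir)
```

A restore should always be verified against the source (checksums or a file-by-file comparison) before the backup is considered valid.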
Confluence, Python. Architecture: cloud, on-prem, hybrid, data modelling, SW architecture design. Design architectures
Warehouse (BW) systems. Ensure a continuous and stable data flow integration between SAP ET2000, AutoPart, and
server reporting services.
system. Ensure a continuous and stable integration of data flow between SAP ET2000, IDIS and E-Parts systems