This role is for a Senior Data Engineer to help build and maintain scalable data pipelines and related architecture, and to document and effectively communicate data engineering processes and solutions. Bachelor's degree in Computer Science, IT, Engineering or Mathematics. 7 years' experience in Data Engineering; high-performance environments such as Docker, Kubernetes, Hadoop, Kafka, NiFi or Spark, or cloud-based big data processing environments like Amazon; data-related programming, scripting or data engineering tools such as Python, R, Julia, T-SQL and PowerShell
Administrator / Systems Administrator / Network Engineer – 3 to 5 years. Minimum: relevant certifications and Security. Ideal: Degree/Diploma in IT or IT Engineering; VMware Certified Professional (VCP). Proficient
Reference: JHB000245-KP-5. Gijima is a leader in Cloud, Outsourcing, Systems Integration, Healthcare, Human progressively responsible roles such as field technician, engineer and team lead. This hands-on experience provides a foundation for leading and motivating a team of field technicians, engineers and support staff. Regional or Area Management
Requirements: Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred). Agile Certified Practitioner. Experience with cloud technologies (e.g., Azure, AWS) and microservices
Python or R and AI frameworks. Experience with cloud-based BI and AI solutions is desirable.
mentoring, and project management. Familiarity with cloud platforms and containerization, especially Azure
lead for Simphony Cloud and own the process of migration for large enterprises into cloud. To follow up on
Python or R and AI frameworks. Experience with cloud-based BI and AI solutions. DUTIES: Enhance data strategy
stacks such as C#, React, and SQL, with exposure to cloud platforms like AWS and Azure. Familiarity with AI