Leverage your skills in building and maintaining Big Data Pipelines using advanced cloud platforms.

Required:
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- Cloud Data Hub (CDH)

Data Engineers are responsible for building and maintaining Big Data Pipelines using Data Platforms. They are custodians
- Artificial intelligence, machine learning and big data technologies and architectures
- Management awareness
- Ability to understand long-term (big picture) and short-term perspectives of events and
ADVANTAGEOUS TECHNICAL SKILLS:
- Oracle/PostgreSQL
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
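For candidates unfamiliar with the term, the extract-transform-load (ETL) pattern named in the skills lists can be sketched in a few lines. This is a minimal illustration only, using plain Python standard-library code and made-up data rather than PySpark, Boto3, or any actual pipeline of the employer:

```python
# Minimal extract-transform-load (ETL) sketch. All data and names here are
# hypothetical; real pipelines would use PySpark DataFrames and Boto3 for
# cloud storage, but the three-stage shape is the same.
import csv
import io


def extract(raw: str) -> list[dict]:
    # Extract: parse raw CSV text into rows (stand-in for reading a source).
    return list(csv.DictReader(io.StringIO(raw)))


def transform(rows: list[dict]) -> list[dict]:
    # Transform: normalise names, cast amounts, drop non-positive rows.
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
        if float(r["amount"]) > 0
    ]


def load(rows: list[dict]) -> dict[str, float]:
    # Load: aggregate into an in-memory "sink" keyed by name
    # (stand-in for writing to a warehouse table).
    sink: dict[str, float] = {}
    for r in rows:
        sink[r["name"]] = sink.get(r["name"], 0.0) + r["amount"]
    return sink


raw = "name,amount\nalice,10\nbob,-5\nalice,2.5\n"
print(load(transform(extract(raw))))  # → {'Alice': 12.5}
```

In PySpark the same stages would be `spark.read`, a chain of DataFrame operations, and `df.write`; the separation of the three stages is what the role's tooling (Docker, schedulers, CDH) orchestrates.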