Required:
– 8 years' experience in the analysis and take-on of big data (importing/exporting of data)
Data-driven programming languages such as Python, and big data pipelines such as ETL and SQL. Strong working knowledge: at least 3 years' experience building big data pipelines (ETL, SQL, etc.). ADVANTAGEOUS TECHNICAL
Data Engineers are responsible for building and maintaining big data pipelines using GROUP Data Platforms. Technologies: Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH). Experience in working with
Reference: NFP014149-SDW-1. Always wanted to be part of a big, successful company? Now is your chance. Job &
Experience and Skills Required: CA(SA); 1-2 years post Big 4 articles; telecommunications background experience
Degree; 5-8 years' progressive experience within a big data environment (Banking, Finance, Mobile Network
- Artificial intelligence, machine learning and big data technologies and architectures
- Management awareness
- Ability to understand long-term (big picture) and short-term perspectives of events and