AWS Data Engineer (Expert)

As an AWS Data Engineer (Expert), you will lead the development and maintenance of Big Data Pipelines using Cloud Data Platforms, leveraging cutting-edge cloud technologies. Data Engineers are custodians of data, responsible for ensuring data integrity and compliance.

Experience & Skills Required:
- Data modelling techniques
- Designing and building scalable ETL systems for a big data warehouse to implement robust and trustworthy operations
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash

Advantageous Skills Requirements:
- Cloud Data Hub (CDH)

Tasks and Responsibilities:
- Build and maintain Big Data Pipelines using Data Platforms.
- Ensure data integrity and compliance.