Leverage your skills in building and maintaining Big Data Pipelines using advanced cloud platforms.
Skills: PySpark - Boto3 - ETL - Docker - Linux / Unix - Big Data - PowerShell / Bash - Cloud Data Hub (CDH)
Engineers are responsible for building and maintaining Big Data Pipelines using Data Platforms. They are custodians
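The extract-transform-load (ETL) pattern named in the skills list can be illustrated with a minimal sketch. This is plain Python standing in for a PySpark job, and every record, function, and field name here is hypothetical; a real pipeline would read from and write to cloud storage (e.g. S3 via Boto3) rather than in-memory lists.

```python
# Minimal ETL sketch: extract raw records, transform (clean and filter),
# then load into an in-memory store. All data and names are illustrative.

def extract():
    # Stand-in for reading raw rows from a source system.
    return [
        {"invoice_id": "A1", "amount": "100.50"},
        {"invoice_id": "A2", "amount": "bad-data"},
        {"invoice_id": "A3", "amount": "75.00"},
    ]

def transform(rows):
    # Keep only rows whose amount parses as a number; cast types.
    clean = []
    for row in rows:
        try:
            clean.append({"invoice_id": row["invoice_id"],
                          "amount": float(row["amount"])})
        except ValueError:
            continue  # drop malformed records
    return clean

def load(rows, store):
    # Stand-in for writing to a data warehouse table.
    for row in rows:
        store[row["invoice_id"]] = row["amount"]
    return store

warehouse = load(transform(extract()), {})
print(warehouse)  # {'A1': 100.5, 'A3': 75.0}
```

The same three-stage shape carries over directly to a PySpark job, where `transform` would become DataFrame operations and `load` a write to warehouse storage.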
Ensure timely and accurate payment of invoices
Matric
Big book value
3 years minimum in a supervisory role
Company Description:
Our client is a big player in pioneering business models, making them
roles
Previous experience dealing with big suppliers
Procurement knowledge
Company Description:
They are a very big development house within South Africa and serve
modelling techniques
with 1-2 years post articles experience
Designing and building scalable ETL systems for a big data warehouse to implement robust and trustworthy
Knowledge of emerging trends across Data/Analytics (Big Data, Machine Learning, Deep Learning, AI)
If you're keen to take the next big step in your career, and you fit the profile below