Our client requires the services of a Data Engineer/Scientist (Expert) – Midrand/Menlyn/Rosslyn/Home Office, to start as soon as possible. ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with requirements. ESSENTIAL SKILLS: Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint. Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV. Experience working with Data Quality Tools.
Our client requires the services of a Data Scientist/Engineer (Entry) – Midrand/Menlyn/Rosslyn/Home Office
Reference: JHB001765-Laka-1 Join our dynamic IT Hub South Africa as a Data Scientist on the AI Platform team. You will bring excellent teamwork abilities and a passion for AI and data analytics. ESSENTIAL SKILLS REQUIREMENTS: - Strong
Reference: JHB001766-Laka-1 Looking for a dynamic Data Scientist to join our team at the forefront of AI
ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with requirements. SKILLS: Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint. Business Intelligence (BI) experience. Technical data modelling and schema design ("not drag and drop").
Reference: JHB001726-KK-1 Join our IT Hub South Africa as a Data Scientist (AI Platform), focusing on innovative AI
Advanced Data Streaming Platform Specialist. Focus on innovative IT solutions in SAP, Java, Cloud, and Data Analytics, and on managing applications on Kubernetes clusters. - Data modeling and database technologies (relational,
2. Utilise AWS services and API Gateway to build scalable and efficient data pipelines.
3. Collaborate with cross-functional teams. Qualifications:
Join us as an AWS Data Engineer (Expert) and leverage your skills in building and maintaining Big Data Pipelines, with expertise in Python, SQL, and AWS services, to drive our data initiatives forward. Be part of a collaborative team where your contributions ensure accurate and secure data management. ESSENTIAL SKILLS REQUIREMENTS: - ETL - Docker - Linux / Unix - Big Data - PowerShell / Bash - Cloud Data Hub (CDH) - CDEC Blueprint ADVANTAGEOUS SKILLS REQUIREMENTS: - Expertise in data modelling with Oracle SQL. - Exceptional analytical
specifications and timelines. Create data models, analyze data, and generate reports to provide relevant insights. Deploy Power BI scripts, perform deep data analysis, and optimize data retrieval using SQL queries. Track Key Performance Indicators (KPIs), use visualizations for better data understanding, and improve ETL procedures. This opportunity is open to disabled candidates who hold a BSc Degree in IT, Data Science, or a related diploma. We are dedicated