and managing applications on Kubernetes clusters. Data modelling and database technologies (relational
Our client requires the services of a Data Scientist/Engineer (Advanced) - Midrand/Menlyn/Rosslyn/Home. COMMENCEMENT: As soon as possible. ROLE: Develop data science solutions, and integrate and scale these into development, deployment, and operations support for Big Data solutions and infrastructure. Manage expectations around business processes. Analyse data and build reports using data visualization tools such as Qlik. 3 – 5 years of experience in the field of Data Science. ESSENTIAL SKILLS: Strong background in mathematics
as possible. ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line. Computer Science, Data Engineering, or a comparable field of study with a focus on data-intensive applications. Experience in architecting and implementing scalable data pipelines in cloud environments, preferably Azure. ESSENTIAL SKILLS: Programming skills in data-related programming languages and frameworks, such as
Our client requires the services of a Data Engineer/Scientist (Expert) - Midrand/Menlyn/Rosslyn/Home. As soon as possible. ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line. Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint. Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV. Experience working with Data Quality Tools
Our client requires the services of a Data Engineer/Scientist (Senior) – Midrand/Menlyn/Rosslyn/Home. As soon as possible. ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line. Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash. Experience in working with Enterprise tools. Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc. Experience working with Data Quality
Work rotation spans morning, evening, and weekend shifts, including public holidays. Working knowledge of Informatics to resolve problems related to data flow between various systems and middleware. Capability to perform tasks in the feature team. Flexibility to work weekends and after hours on a shift schedule. IT experience
knowledge across all SAP modules. Initial focus on master data. Ability to assist with problem identification (advantageous). SAP FIORI (advantageous). SAP MDG Master Data Governance (advantageous). ADVANTAGEOUS TECHNICAL SKILLS: speaking (advantageous). Flexibility to work some weekends/shifts or longer hours if required. Experienced
Work rotation spans morning, evening, and weekend shifts, including public holidays. Provide general Informatics understanding to resolve problems related to data flow between various systems and middleware. Strong ability to perform tasks in the feature team. Flexibility to work weekends and after hours on a shift schedule. IT experience, and able to travel internationally. Working on weekends
JSON, RFC, IDocs). Workflow. SAP S/4HANA skills. Core Data Services & AMDP. Database update programming. Management. Solution architecture, design, and development. Data modelling. Excellent debugging and troubleshooting in an agile environment. Willing and able to work on weekends and public holidays. Willing to travel internationally
Writing test cases, test execution, and defect capture. Planning and effort estimation for test case execution. Completion from a testing perspective. Coordinate test data creation with the developers. Track new/changed requirements