Our client requires the services of a Data Engineer/Scientist (Senior) – Midrand/Menlyn/Rosslyn/Home Office as possible.

ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with …

QUALIFICATIONS: … Computer Science, Data Engineering or a comparable field of study with a focus on data-intensive applications. Experience in architecting and implementing scalable data pipelines in cloud environments, preferably Azure.

ESSENTIAL SKILLS:
- Programming skills in data-related programming languages and frameworks, such as …
- Oracle/PostgreSQL
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- Experience in working with Enterprise …
- Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc.
- Experience working with Data Quality …
- … and managing applications on Kubernetes clusters
- Data modelling and Database technologies (relational …
Our client also requires the services of a Data Engineer/Scientist (Expert) – Midrand/Menlyn/Rosslyn/Home Office as possible; the ROLE is as described above.

ESSENTIAL SKILLS:
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- GROUP Cloud Data Hub (CDH)
- GROUP CDEC Blueprint
- Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV
- Experience working with Data Quality Tools
- … knowledge across all SAP modules; initial focus on master data
- Ability to assist with problem identification and … (advantageous)
- SAP FIORI (advantageous)
- SAP MDG Master Data Governance (advantageous)

ADVANTAGEOUS TECHNICAL …:
- … speaking (advantageous)
- Flexibility to work some weekends / shifts or longer hours if required
- Experienced …
- … JSON, RFC, IDOCs)
- Workflow
- SAP S/4HANA skills
- Core Data Services & AMDP
- Database update programming
- … management
- Solution architecture, design, and development
- Data Modelling
- Excellent debugging and troubleshooting …
- … agile environment
- Willing and able to work on weekends and public holidays
- Willing to travel internationally
- Writing test cases, test execution, and defect capture
- Planning and effort estimation for test case execution
- … completion from a testing perspective
- Coordinate test data creation with the developers
- Track new/changed requirements
- … modules for integration requirements
- Preparing test data and documentation; conducting unit tests, regression …
- … planning meetings, etc.
- Ensure availability to work on weekends and public holidays when required
- Daily use of …
- … knowledge (advantageous)
- Flexibility to work some weekends / shifts or longer hours if required
- Experienced …
- … helpdesk, operations, finance, HR and IT
- Prepare data and commentary as requested
- Attend training sessions …
- … proficient in G Suite
- Experience with data capturing
- Experience in the public education sector will be advantageous
- … (employer)
- Death and disability funds (employer)
- LTE data for work
- Annual leave: 20 days