JHB000383-BM-1 Our client requires the services of a highly skilled and experienced Data Architect to join our team. As a Data Architect, you will be responsible for designing, developing, and implementing data solutions that meet our organization's needs. You will collaborate with stakeholders to understand requirements and translate them into data architecture solutions, design data models, and ensure data integrity, security, and availability. Design and develop scalable and efficient data models for relational and non-relational databases. Define data standards, guidelines, and best practices to ensure data quality and consistency. A Bachelor's Degree in Information Technology or a related field is required.
Data Engineers are responsible for building and maintaining Big Data Pipelines using Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the information classification requirements. Skills: PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash. Salary: Market Related.
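The core skill the Data Engineer ad lists, building and maintaining ETL pipelines, can be sketched in plain Python. This is a minimal illustration of the extract/transform/load pattern only; the CSV source, the `sales` table, and the normalisation rules are illustrative assumptions, not part of the posting.

```python
import csv
import io
import sqlite3

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalise customer names and convert amounts to cents."""
    return [
        (r["customer"].strip().title(), int(round(float(r["amount"]) * 100)))
        for r in rows
    ]

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Load: write cleaned rows into the target table; return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount_cents INTEGER)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]

# Hypothetical input data; a production pipeline would read from a real source.
raw = "customer,amount\n alice ,12.50\nBOB,3.00\n"
conn = sqlite3.connect(":memory:")
count = load(transform(extract(raw)), conn)
```

In practice the same three-stage shape carries over to the PySpark and Boto3 tooling the ad names, with the extract and load stages pointed at distributed storage instead of an in-memory database.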
Our client, operating in the SAP, Java, Azure and Cloud landscape, seeks a Data Streaming Platform Engineer responsible for running business-critical systems.
VDI Deployment: understanding of CA Data Management / Follow-the-Data as a VDI deployment approach. Driving a dedicated ITSM service. Willing, upon demand, to work on weekends (changes). Operating and maintaining deletion scripts. Advantageous: any operating system certification relating to data management; any programming certification; any web certification.
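The "operate and maintain deletion scripts" duty above can be sketched as a small retention job. The 30-day window, directory layout, and file names are illustrative assumptions; the posting does not specify them.

```python
import os
import tempfile
import time
from pathlib import Path

# Assumed retention window; the posting does not state one.
RETENTION_DAYS = 30

def purge_old_files(root: Path, retention_days: int = RETENTION_DAYS) -> list[Path]:
    """Delete files under `root` older than the retention window.

    Returns the removed paths so each run can be logged as an ITSM change.
    """
    cutoff = time.time() - retention_days * 86400
    removed = []
    for path in sorted(root.rglob("*")):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path)
    return removed

# Demo on a throwaway directory: backdate one file past the window.
root = Path(tempfile.mkdtemp())
(root / "stale.log").write_text("stale")
(root / "fresh.log").write_text("fresh")
os.utime(root / "stale.log", (time.time() - 40 * 86400,) * 2)
removed = purge_old_files(root)
```

Returning the deleted paths, rather than deleting silently, is the design choice that makes the script auditable within a change-managed ITSM process.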
Responsibilities: Analysing requirements. Developing and capturing test cases. Executing the test cases. Tracking and reporting results. Analysing CRs (Change Requests) and preparing Functional Specifications for them. Preparing test data for testing of CRs. Testing CRs and reporting differences. Willingness and ability to work on weekends and public holidays on implementation and operations.
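The develop/execute/track test-case cycle described above can be sketched with Python's built-in unittest. The function under test and its expected behaviour are hypothetical stand-ins for a real CR.

```python
import unittest

def apply_change_request(price: float, discount_pct: float) -> float:
    """Hypothetical function under test: a CR that applies a discount."""
    return round(price * (1 - discount_pct / 100), 2)

class ChangeRequestTests(unittest.TestCase):
    # Each test case captures one requirement traced back to the (hypothetical) CR.
    def test_discount_applied(self):
        self.assertEqual(apply_change_request(100.0, 15), 85.0)

    def test_zero_discount_is_identity(self):
        self.assertEqual(apply_change_request(59.99, 0), 59.99)

# Execute the captured test cases and collect the results for tracking.
suite = unittest.TestLoader().loadTestsFromTestCase(ChangeRequestTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The `result` object carries the pass/fail counts, which is the raw material for the tracking and reporting duties the ad lists.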
Ability to develop in data-driven programming languages such as Python and to build Big Data pipelines (ETL). Understanding of calculations from a developer perspective. Expertise in Data Intelligence and Business Intelligence. At least 3 years' experience building big data pipelines (ETL, SQL, etc.). Salary: Market Related.
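The ad pairs Python with SQL for pipeline work. One common split, sketched below under assumed table and column names, is to push the transform step into the database engine rather than doing it in Python.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (username TEXT, ms INTEGER)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("a", 1500), ("a", 500), ("b", 3000)],
)

# Transform expressed in SQL rather than Python: aggregate per-user
# milliseconds into seconds inside the database engine.
rows = conn.execute(
    "SELECT username, SUM(ms) / 1000.0 AS seconds "
    "FROM raw_events GROUP BY username ORDER BY username"
).fetchall()
```

Doing aggregation in SQL keeps the data close to where it lives and scales with the engine, which is why pipeline roles like this one ask for both languages.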