- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- Enterprise Collaboration
- Experience with data formats: Parquet, AVRO, JSON, XML, CSV
- Experience with Data Quality Tools (e.g. …)
- Expertise in data modeling (Oracle SQL)
- Strong analytical skills for large data sets
- Testing and data validation experience
- Precise documentation skills
- Self-driven, with the ability to work independently and in teams
- Experience with AWS Glue, Data Pipeline, or similar
- Familiarity with AWS
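As an illustrative sketch of working with the data formats listed above, the following uses only Python's standard library to round-trip the same records through CSV, JSON, and XML (Parquet and Avro would require third-party libraries such as pyarrow or fastavro, omitted here; the sample records are hypothetical):

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

# Hypothetical sample records with string values throughout
records = [{"plant": "Rosslyn", "units": "120"}, {"plant": "Menlyn", "units": "85"}]

# CSV: write the records to a buffer, then read them back
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["plant", "units"])
writer.writeheader()
writer.writerows(records)
from_csv = list(csv.DictReader(io.StringIO(buf.getvalue())))

# JSON: serialize and parse
from_json = json.loads(json.dumps(records))

# XML: build a tree with one attribute set per record, then read it back
root = ET.Element("plants")
for rec in records:
    ET.SubElement(root, "plant", rec)
from_xml = [dict(el.attrib) for el in ET.fromstring(ET.tostring(root))]

assert from_csv == from_json == from_xml == records
```

In practice the same round-trip discipline (write, read back, compare) is a simple basis for the data validation work the listing asks for.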
- Creation of databases
- Planning and maintenance of data configuration
- Implementation and enforcement of:
  - Automated data archiving
  - Automated data compression
  - Automated table partitioning
- Data restoration according to an Oracle-allocated, priority-based timeline
- Generation of audit and user/environment review reports
- Generation of capacity forecasting reports
- Generation of any required ad hoc reports where needed

Qualifications and Experience
Essential Skills Requirements:
- Basic knowledge of ITIL and ITSM such as Problem-Incident-Change Management processes
- Deep knowledge of Confluence and Jira; ability to understand and transfer requirements into User Stories
- Professional communication and documentation
Reference: JHB001890-Laka-1
We are seeking an Expert ServiceNow Developer specializing in Reporting and …
ESSENTIAL SKILLS REQUIREMENTS:
- Practical knowledge in one or more of the following:
- ServiceNow development (UI Builder, Flows, Performance Analytics, Reporting, Platform Analytics, Generative AI, Scripting in ServiceNow, Virtual Agent)
- Alternatively, sound Jav…
COMMENCEMENT: As soon as possible
ROLE: Responsible for CA data management of the central sector and all plants, and related guidelines:
- Consult users on how to store CA data and permanent data archives, as well as run reports
- Assign … (Windows only)
- Create the CA data structure as per GROUP guidelines
- Rearrange the CA data structure as per project
- Clean up obsolete CA data using the web application CA Cleanup Tool (Unix only)
- Backup and restore CA data
- Allow users to work on weekends (changes)
- Serve as contact for data-management-related questions in the cloud
- Serve …
We are seeking the services of a Technology Integrator (Chief Expert) – Midrand/Menlyn/Rosslyn/Home Office Rotation
COMMENCEMENT: As soon as possible
ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with …
Skills:
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- GROUP Cloud Data Hub (CDH)
- GROUP CDEC Blueprint
- Business Intelligence (BI) experience
- Technical data modelling and schema design ("not drag and drop")
Requirements:
- Sound knowledge of Python and Java
- Data Science concepts and principles
- Experience in tools … applications
- Architecture and interface design
- Data modeling and database technologies (relational, …
Are you passionate about leveraging data-driven insights to shape the future of automotive technology? Our client is looking for a talented Data Scientist to join their AI Platform team and drive innovation.