Essential Skills Requirements:
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
Advantageous Skills Requirements:
- Expertise in data modelling with Oracle SQL
- Analytical skills for large and complex data sets
- Thorough testing and data validation
- Strong written and verbal communication
- Ability to work with collaboration tools (Confluence, JIRA, etc.)
- Knowledge of Cloud Data Hub (CDH) and CDEC Blueprint
- Development
- Knowledge of data formats (Parquet, AVRO, JSON, XML, CSV, etc.); a brief example follows this list
- Experience with Data Quality Tools
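As a purely illustrative sketch of the data-format knowledge listed above (file names and columns are hypothetical, and pandas with pyarrow is assumed to be installed), a small Python snippet might convert a CSV extract to Parquet and verify the round trip:

    import pandas as pd

    # Load a raw CSV extract (path and columns are hypothetical)
    orders = pd.read_csv("raw_orders.csv", parse_dates=["order_date"])

    # A basic data quality check before converting formats
    assert orders["order_id"].notna().all(), "order_id must not contain nulls"

    # Write a columnar Parquet copy (pandas delegates to pyarrow)
    orders.to_parquet("orders.parquet", index=False)

    # Read it back and confirm the row count survived the conversion
    roundtrip = pd.read_parquet("orders.parquet")
    print(len(orders) == len(roundtrip))  # expect True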
- Linux / Unix
- Big Data
- PowerShell / Bash
- Cloud Data Hub (CDH)
- CDEC Blueprint
- Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc.
- Experience working with Data Quality
REQUIREMENTS:
- Demonstrated expertise in data modelling with Oracle SQL.
- Exceptional analytical skills for large and complex data sets.
- Perform thorough testing and data validation to ensure the accuracy of data transformations.
- Ability to multi-task.
- Experience building data pipelines using AWS Glue or Data Pipeline, or similar platforms.
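As a hedged sketch of what building a pipeline with AWS Glue can look like in practice (the database, table, and S3 path below are placeholders, not details from this listing), a minimal PySpark-based Glue job reads from the Data Catalog, applies a mapping, and writes Parquet to S3:

    import sys
    from awsglue.transforms import ApplyMapping
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    # Standard Glue job bootstrap
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read a source table from the Glue Data Catalog (names are hypothetical)
    source = glue_context.create_dynamic_frame.from_catalog(
        database="example_db", table_name="raw_orders")

    # Keep and retype a few columns as a simple transformation step
    mapped = ApplyMapping.apply(
        frame=source,
        mappings=[("order_id", "string", "order_id", "string"),
                  ("amount", "string", "amount", "double")])

    # Write the result to S3 as Parquet (bucket and path are placeholders)
    glue_context.write_dynamic_frame.from_options(
        frame=mapped,
        connection_type="s3",
        connection_options={"path": "s3://example-bucket/curated/orders/"},
        format="parquet")

    job.commit()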
- Linux / Unix
- Big Data
- PowerShell / Bash
- Cloud Data Hub (CDH)
- CDEC Blueprint
- Technical data modelling and schema design ("not drag and drop")
- Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc.
- Experience working with Data Quality
REQUIREMENTS:
- Demonstrated expertise in data modelling with Oracle SQL.
- Exceptional analytical skills for large and complex data sets.
- Perform thorough testing and data validation to ensure the accuracy of data transformations.
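To illustrate the testing and data validation requirement above, a reconciliation check might compare a source extract against its transformed output; this is a minimal sketch, and the file and column names are invented for the example:

    import pandas as pd

    # Source extract and transformed output (file names are hypothetical)
    source = pd.read_csv("source_orders.csv")
    target = pd.read_parquet("transformed_orders.parquet")

    checks = {
        # No rows should be lost or duplicated by the transformation
        "row_count": len(source) == len(target),
        # Monetary totals should reconcile exactly (to the cent)
        "amount_total": round(source["amount"].sum(), 2) == round(target["amount"].sum(), 2),
        # Business keys must remain unique after the transformation
        "unique_keys": target["order_id"].is_unique,
    }

    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        raise ValueError(f"Data validation failed: {failed}")
    print("All validation checks passed")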
Hybrid in Gauteng: a data integration expert is needed for a Fund Accounting channel within a large financial services institution.
- 3+ years of experience in financial data integration
- Data management and Fund Accounting
- Experience in fund accounting is highly desirable
- Experience with data integration tools and software
- Familiarity with ASISA instrument formats
- Experience with data lakes and data manipulation techniques
- SQL, Python
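As a small, hypothetical sketch of the Python data-manipulation skills this role asks for (the holdings file and its columns are invented and not tied to any ASISA format), one might aggregate fund holdings with pandas:

    import pandas as pd

    # Hypothetical holdings extract: one row per fund per instrument
    holdings = pd.read_csv("fund_holdings.csv")

    # Total market value per fund and instrument type
    summary = (holdings
               .groupby(["fund_code", "instrument_type"], as_index=False)["market_value"]
               .sum()
               .sort_values("market_value", ascending=False))

    print(summary.head())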
Join the team as an AWS Data Engineer (Expert) and lead the development and maintenance of Big Data Pipelines, ensuring the integrity and quality of our data solutions. Make an impact in a role that blends ETL, Docker, Linux / Unix, Big Data, PowerShell / Bash, Cloud Data Hub (CDH), and CDEC Blueprint, along with knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc., and experience working with Data Quality. Skills requirements include demonstrated expertise in data modelling with Oracle SQL and exceptional analytical skills for large and complex data sets.
Join the team in South Africa as a Senior AWS Data Engineer, where you'll build and maintain Big Data pipelines, leveraging cutting-edge tools such as Python and AWS Glue. Drive innovation and ensure data integrity within a global enterprise environment; the role requires experience with ETL processes and a background in data modelling. Essential Skills Requirements: Terraform, Linux / Unix, Big Data, PowerShell / Bash. Advantageous Skills Requirements: expertise in data modelling with Oracle SQL, analytical skills for large and complex data sets, thorough testing and data validation, and strong written and verbal communication.
Join as an AWS Data Engineer (Chief Expert) in South Africa. You'll build and maintain Big Data Pipelines, ensuring data integrity and compliance, leverage your expertise in AWS technologies, and mentor junior engineers. Skills span ETL, Docker, Linux / Unix, Big Data, PowerShell / Bash, Cloud Data Hub (CDH), CDEC Blueprint, basic Business Intelligence (BI) experience, technical data modelling and schema design (“not drag and drop”), knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc., and experience working with Data Quality.
This role focuses on building integrations that connect various software applications and data sources. It involves collaborating with cross-functional teams, documenting processes, and maintaining compliance with data security regulations. Requirements include:
- Knowledge of APIs, web services (SOAP, REST), and data formats (XML/JSON); see the sketch below.
- Experience with programming and cloud platforms (AWS, Azure, Google Cloud).
- Knowledge of data integration and ETL tools.
- Familiarity with Agile.
terms of benefits and risks.
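To illustrate the API and data-format requirements above, here is a minimal Python sketch using only the standard library; the endpoint URL, XML payload, and field names are hypothetical examples rather than anything specified by the role:

    import json
    import urllib.request
    import xml.etree.ElementTree as ET

    # Call a hypothetical REST endpoint and decode its JSON response
    with urllib.request.urlopen("https://api.example.com/v1/accounts/123") as resp:
        account = json.loads(resp.read().decode("utf-8"))
    print(account.get("status"))

    # Parse a small XML payload, e.g. from a SOAP-style service
    xml_payload = "<account><id>123</id><status>active</status></account>"
    root = ET.fromstring(xml_payload)
    print(root.findtext("status"))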