Responsibilities: Analysing requirements; developing and capturing test cases; executing the test cases; tracking and
ensuring comprehensive engagement; and accurately capturing essential functional requirements.
an Expert AWS Data Engineer in South Africa. Design, implement, and optimize Big Data Pipelines using AWS services. Ensure data integrity, security, and compliance while collaborating with teams, applying expertise in AWS technologies and data engineering to drive impactful data solutions. Key Responsibilities: - Design, implement, and optimize Big Data Pipelines using AWS services. - Ensure data integrity, security, and compliance documentation. - Conduct thorough testing and validation of data solutions. - Stay updated with industry trends and
candidate, you will be responsible for data analysis, reporting, data integration, and empowering the Control Tower team, working closely with the IT development team to optimize data processes and enhance overall operational efficiency. Duties & Responsibilities Include: Data Analysis and Reporting: Develop SQL queries and BI reports to extract actionable insights from our data. Collaborate with the Control Tower team to understand delivery. Data Integration: Design and maintain data integration processes, consolidating data from various
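As a sketch of the kind of SQL reporting query described above (the shipments table, its columns, and the on-time metric are all hypothetical, invented for illustration rather than taken from any real Control Tower schema), a minimal Python example using the standard-library sqlite3 module:

```python
import sqlite3

# Hypothetical shipment data; illustrates the shape of a BI-style
# reporting query, not any actual production schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipments (carrier TEXT, delivered INTEGER)")
conn.executemany(
    "INSERT INTO shipments VALUES (?, ?)",
    [("A", 1), ("A", 0), ("B", 1), ("B", 1)],
)

# Aggregate per carrier: the average of a 0/1 flag gives an on-time rate.
rows = conn.execute(
    "SELECT carrier, AVG(delivered) AS on_time_rate "
    "FROM shipments GROUP BY carrier ORDER BY carrier"
).fetchall()
print(rows)
```

The GROUP BY aggregation pattern shown here is the typical backbone of such insight reports: raw event rows in, one summary row per business dimension out.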
SQL technologies, with a special focus on financial data reporting and analytics. This role requires experience in BI Development, ETL, Analysis Services, and data warehousing within a Microsoft environment. Required: experience in BI development and data warehousing with a focus on financial data; strong analytical skills; experience with financial systems and an understanding of their data and reporting frameworks; knowledge of financial and Power BI solutions to meet specific financial data reporting and analysis requirements. BI Development:
Reference: JHB009406-BG-1 AWS Data Engineer ESSENTIAL SKILLS REQUIREMENTS: Exceptional experience/understanding of Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash. Experience in working with Enterprise Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc. Experience working with Data Quality and complex data sets. - Perform thorough testing and data validation to ensure the accuracy of data transformations. - Experience building data pipelines using AWS Glue or Data Pipeline, or similar platforms. - Familiar with data store
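To illustrate moving between two of the data formats the posting names (JSON and CSV; the function name and sample records are invented for this example), a minimal standard-library sketch:

```python
import csv
import io
import json

def json_records_to_csv(records_json: str) -> str:
    """Convert a JSON array of flat records into CSV text.

    A toy illustration of format conversion; field order follows
    the keys of the first record.
    """
    records = json.loads(records_json)
    if not records:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(records[0].keys()))
    writer.writeheader()   # emit "id,name" style header row
    writer.writerows(records)
    return buf.getvalue()

sample = '[{"id": 1, "name": "Ada"}, {"id": 2, "name": "Linus"}]'
print(json_records_to_csv(sample))
```

In a real pipeline the same transformation would typically run on PySpark or AWS Glue rather than in-memory, but the format-mapping concern is the same.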
PYTHON BI DEVELOPER / DATA SCIENTIST / DATA ANALYST. OFFICE-BASED POSITION (MONDAY - FRIDAY) - NO REMOTE. (added advantage) Data warehouse design. Azure cloud computing (added advantage). Azure data lake and database. Policies relating to data/reporting and confidentiality. Understanding of data requirements. Knowledge of structure and where relevant data resides. Envision optimal structure for reporting. Data warehousing. Appropriate dataset portfolio. Ability to update/improve current data for various functions. Understanding of report requirements
of Research Data Infrastructure. You will need to work with a wide range of scientific data and be able to work with several front-end applications catering to different data types, both quantitative and qualitative. In addition, client-created data in a database and APIs to support sensible models for client-server data exchanges. Querying these databases using SQL. Experience working with JSON data and JSON APIs. Experience in the use of version control. Visualising geospatial data. Python and/or R programming experience. Exposure to Data Science and Statistical
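A minimal sketch of consuming JSON data of the kind such APIs return and computing a simple summary statistic (the payload shape, station code, and field names are all hypothetical):

```python
import json
import statistics

# Hypothetical JSON payload resembling what a scientific data API
# might serve; field names are invented for illustration.
payload = '{"station": "CPT-01", "readings": [12.0, 11.0, 13.0, 12.0]}'

data = json.loads(payload)                          # parse JSON text into a dict
mean_reading = statistics.fmean(data["readings"])   # summary statistic
print(data["station"], mean_reading)
```

The same parse-then-summarise shape scales up naturally: in practice the payload would come from an HTTP call and the statistics from pandas or R rather than the stdlib.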
reusable code.
role in designing, developing, and optimizing our data infrastructure for e-commerce payment systems. You
requirements into scalable SQL solutions, ensuring data integrity, performance, and reliability.
data models and database schemas