months fixed-term contract and contribute to their data-driven approach to financial management and decision-making. The ideal candidate will possess a strong background in finance, data analysis, and Qlik. Collaborate with finance teams to understand data needs and provide actionable insights that support decision-making. Design and implement data models, ETL processes, and data warehouses. Perform regular data validation and quality checks to ensure the accuracy and integrity of financial data. Provide
SQL technologies, with a special focus on financial data reporting and analytics. This role requires working experience in BI development, ETL, Analysis Services, and data warehousing within a Microsoft environment, with a focus on financial data. Strong analytical skills and experience with financial systems, including an understanding of their data and reporting frameworks, are required, along with the ability to deliver Power BI solutions that meet specific financial data reporting and analysis requirements. BI Development:
Reference: JHB009406-BG-1 AWS Data Engineer ESSENTIAL SKILLS REQUIREMENTS: - Exceptional experience/understanding of Oracle/PostgreSQL - PySpark - Boto3 - ETL - Docker - Linux/Unix - Big Data - PowerShell/Bash - Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV, etc. - Experience working with data quality on complex data sets - Perform thorough testing and data validation to ensure the accuracy of data transformations - Experience building data pipelines using AWS Glue, Data Pipeline, or similar platforms - Familiar with data store
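The data-validation duty listed above can be sketched in miniature. This is a minimal, standard-library-only illustration of the kind of quality check a pipeline might run before loading; the column names (`id`, `amount`) and the rules are invented for the example, not taken from any actual system.

```python
import csv
import io

def validate_rows(csv_text, required_columns):
    """Basic quality checks on CSV data: required columns present, no
    empty values, and 'amount' parses as a number. Returns a list of
    (row_number, message) issues; an empty list means the data passed."""
    issues = []
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in required_columns if c not in (reader.fieldnames or [])]
    if missing:
        return [(0, f"missing columns: {missing}")]
    for i, row in enumerate(reader, start=1):
        for col in required_columns:
            if not row[col].strip():
                issues.append((i, f"empty value in '{col}'"))
        val = row["amount"].strip()
        if val:
            try:
                float(val)
            except ValueError:
                issues.append((i, "non-numeric 'amount'"))
    return issues

# A small sample with one empty and one non-numeric amount.
sample = "id,amount\n1,10.50\n2,\n3,abc\n"
problems = validate_rows(sample, ["id", "amount"])
```

In a real AWS Glue job the same checks would run against the job's DynamicFrames rather than an in-memory string, but the pattern of collecting row-level issues before loading is the same.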
of Research Data Infrastructure. You will need to work with a wide range of scientific data and support several front-end applications catering to different data types, both quantitative and qualitative. In addition, you will store client-created data in a database and build APIs that support sensible models for client-server data exchange, querying these databases using SQL. Experience working with JSON data and JSON APIs. Experience in the use of version control. Experience visualising geospatial data. Python and/or R programming experience. Exposure to Data Science and Statistical
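The JSON-API experience asked for above amounts to parsing payloads and reshaping them for storage or visualisation. A minimal standard-library sketch; the payload structure (a "records" list of site observations) is invented for illustration.

```python
import json

# A payload shaped like a typical JSON API response; the field names
# ("records", "site", "value") are hypothetical.
payload = '''
{"records": [
  {"site": "A", "value": 3.2},
  {"site": "B", "value": 4.8},
  {"site": "A", "value": 2.0}
]}
'''

data = json.loads(payload)

# Aggregate observations per site, a common first step before
# visualising or loading into a database.
totals = {}
for rec in data["records"]:
    totals[rec["site"]] = totals.get(rec["site"], 0.0) + rec["value"]
```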
reusable code.
role in designing, developing, and optimizing our data infrastructure for e-commerce payment systems. You will translate business requirements into scalable SQL solutions, ensuring data integrity, performance, and reliability.
data models and database schemas
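Ensuring data integrity in a schema, as this role requires, is largely a matter of declaring constraints so the database rejects bad rows at write time. A toy sketch using an in-memory SQLite database; the table and column names are illustrative only, not from any real payment system.

```python
import sqlite3

# In-memory database standing in for a payments store.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.executescript("""
CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    email       TEXT NOT NULL UNIQUE
);
CREATE TABLE payments (
    payment_id  INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
    amount      NUMERIC NOT NULL CHECK (amount > 0),
    status      TEXT NOT NULL DEFAULT 'pending'
);
""")
conn.execute("INSERT INTO customers (customer_id, email) VALUES (1, 'a@example.com')")
conn.execute("INSERT INTO payments (customer_id, amount) VALUES (1, 19.99)")

# The foreign-key constraint rejects a payment for a customer
# that does not exist:
try:
    conn.execute("INSERT INTO payments (customer_id, amount) VALUES (99, 5.0)")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
```

The same idea scales to production engines (SQL Server, PostgreSQL): integrity rules live in the schema, not only in application code.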
Ability to develop in data-driven programming languages such as Python and to build big data pipelines such as ETL calculations from a developer perspective. Expertise in Data Intelligence and Business Intelligence. At least 3 years' experience building big data pipelines (ETL, SQL, etc.). Salary: market related
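The ETL pipelines mentioned in these requirements reduce, at their smallest, to three steps: extract raw records, transform them into a clean shape, and load them into a target store. A toy end-to-end sketch; the CSV source and the `revenue` table are invented for illustration.

```python
import csv
import io
import sqlite3

# Extract: read raw rows from a CSV source (inlined here; in practice
# this would come from files, a queue, or an object store).
raw = "name,revenue\nacme,1000\nglobex,2500\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: normalise names and convert string fields to typed values.
records = [(r["name"].upper(), int(r["revenue"])) for r in rows]

# Load: write the cleaned records into the target table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE revenue (name TEXT, amount INTEGER)")
db.executemany("INSERT INTO revenue VALUES (?, ?)", records)

total = db.execute("SELECT SUM(amount) FROM revenue").fetchone()[0]
```

At big-data scale the same shape is expressed in PySpark or an orchestration platform, but the extract-transform-load contract is unchanged.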
(bugs/deviations from requirements) using issue reports. Capture reported issues in the Fault Reporting and