improvement efforts.
Design and develop data models, ETL processes, and pipelines to support data ingestion, lineage, and data flow. Design and implement data quality and data validation processes to ensure the accuracy and consistency of data.
Data Quality and Consistency: Implement robust data validation processes to maintain high-quality data. Establish data retention guidelines. Regularly monitor data sources and processing environments, resolving inconsistencies as they arise. Experience with Azure DevOps. Experience with large-scale data processing using Apache Spark. Knowledge of GCP and AWS.
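As a hedged illustration of the kind of data validation process described above, here is a minimal sketch in plain Python (the field names `order_id` and `amount`, and the sample records, are hypothetical, not from any actual pipeline):

```python
# Hypothetical validation sketch: flag records with missing keys or
# negative amounts before they enter downstream processing.
records = [
    {"order_id": "A1", "amount": 100.0},
    {"order_id": None, "amount": 50.0},
    {"order_id": "A3", "amount": -5.0},
]

def validate(row):
    """Return a list of validation errors for one record (empty = valid)."""
    errors = []
    if not row.get("order_id"):
        errors.append("missing order_id")
    if row.get("amount", 0) < 0:
        errors.append("negative amount")
    return errors

# Collect failing records together with their error lists.
invalid = [(r, validate(r)) for r in records if validate(r)]
print(len(invalid))  # 2 records fail validation
```

In a production setting the same rule-per-field pattern would typically run inside the ETL framework itself (e.g. as Spark DataFrame filters) rather than over in-memory dictionaries.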
A degree/diploma in computer science, statistics, or a related data processing qualification would be beneficial.
Willingness to learn and adopt new technologies to improve data processes and infrastructure.
Integrate new data sources into company databases for analysis, inclusion in relevant processes, and ERP integration. Embed data into workflow processes to drive continuous improvement and optimization. Participate in initiatives to improve data accuracy and quality. Conduct root cause analysis on data and processes to address business questions and identify areas for improvement. Enhance and automate data delivery and quality in ETL processes. Guide and mentor junior team members.
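The extract/transform/load duties above can be sketched as a minimal pipeline (a toy example in plain Python; the CSV-like source lines, field names, and derived `total` column are all hypothetical):

```python
# Minimal ETL sketch: extract raw rows, transform (parse + derive a
# total), and load the results into an in-memory target.
raw = ["widget, 3, 2.50", "gadget, 2, 4.00", "widget, 1, 2.50"]

def extract(lines):
    """Parse raw delimited lines into typed records."""
    for line in lines:
        name, qty, price = (f.strip() for f in line.split(","))
        yield {"name": name, "qty": int(qty), "price": float(price)}

def transform(rows):
    """Derive a line total for each record."""
    for r in rows:
        r["total"] = r["qty"] * r["price"]
        yield r

# "Load" step: materialize into the target store.
target = list(transform(extract(raw)))
print(sum(r["total"] for r in target))  # 18.0
```

The generator-per-stage structure is one common way to keep each ETL step independently testable; a real pipeline would swap the in-memory lists for database or file connectors.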
Perform reconciliations and ensure accuracy of financial data. Process accounts payable and receivable transactions.
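A reconciliation of the kind mentioned above can be sketched as matching ledger entries against bank transactions by reference (a toy sketch; the `INV-*` references and amounts are invented for illustration):

```python
# Hypothetical reconciliation check: flag references whose amounts
# disagree between ledger and bank, and ledger entries with no bank match.
ledger = {"INV-1": 100.0, "INV-2": 250.0, "INV-3": 75.0}
bank   = {"INV-1": 100.0, "INV-2": 240.0}

# Amount mismatches among references present in both sets.
mismatches = {ref: (ledger[ref], bank[ref])
              for ref in ledger.keys() & bank.keys()
              if ledger[ref] != bank[ref]}

# Ledger entries with no corresponding bank transaction.
missing = ledger.keys() - bank.keys()

print(sorted(mismatches), sorted(missing))  # ['INV-2'] ['INV-3']
```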
receipt of order, generate invoice and process weighing data; ● Ensuring price conditions are copied from
fields. Excellent quantitative analysis and data processing skills; proficient in Excel and financial analysis
Auditing/Finance/Accounting ● Basic knowledge of Electronic Data Processing ● Have 2 to 5 years of auditing experience ● Have