Job Title: AWS Data Engineers Required - Project Based - Contractual - to start ASAP. Location: JHB - Hybrid. Responsibilities: 1. Design, develop, and implement AWS cloud solutions for data engineering projects. 2. Utilise AWS services such as API Gateway to build scalable and efficient data pipelines. 3. Collaborate with cross-functional teams. Required Qualifications: Proven experience as an AWS Data Engineer or similar role. Strong understanding of scheduling and identifying data/job dependencies. Knowledge / experience with data engineering on the AWS platform.
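To illustrate the kind of pipeline work this listing describes, here is a minimal sketch, assuming an API Gateway proxy integration in front of an AWS Lambda function that lands incoming records in an S3 raw zone. The bucket name, environment variable, and payload shape are illustrative assumptions, not details from the vacancy.

```python
# Hypothetical sketch: Lambda handler behind API Gateway that accepts a JSON
# payload and writes it to an S3 raw zone for downstream batch processing.
import json
import os
import uuid
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
RAW_BUCKET = os.environ.get("RAW_BUCKET", "example-raw-zone")  # assumed bucket name


def handler(event, context):
    """Entry point invoked by API Gateway (Lambda proxy integration)."""
    record = json.loads(event.get("body") or "{}")
    # Partition raw objects by ingestion date so scheduled jobs can pick them up.
    now = datetime.now(timezone.utc)
    key = f"ingest/{now:%Y/%m/%d}/{uuid.uuid4()}.json"
    s3.put_object(Bucket=RAW_BUCKET, Key=key, Body=json.dumps(record).encode("utf-8"))
    return {"statusCode": 202, "body": json.dumps({"stored_as": key})}
```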
internationally for over 5,000 clients. Data acquisition, data cleansing, data transformation. Machine learning algorithms to uncover data patterns, make predictions and build intelligent systems.
We are currently recruiting a Master Data Specialist (Finance) for a 6-12 month contract. Degree required.
business units that contribute to the monetisation of data through the development and delivery of advanced principles that find hidden patterns and associations in data, used to influence business strategy and inform channel
As a graduated Data Analyst, the candidate will be responsible for working with the company's data in various business areas. Track and report data. Manage campaign budgets. Maintain competitive market knowledge.
practical experience and knowledge in the fields of Data Capturing and Processing. Van der Vyver Transport is
focus on delivering customised data warehousing design and development and data visualization to a global client base. Develop an understanding of business processes and data reporting needs. Critically analyze business requirements. Architect and implement design flows and data integration. Develop data warehouse models and lineage documentation that will be used by ETL developers to build out the data warehouse. Produce technical/architectural specification documents. Design and create ETL/ELT scripts for data warehouse and reporting tables. Implement stored procedures.
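As a rough illustration of the ETL scripting this role calls for, the sketch below loads a reporting table from a staging table. It assumes a SQLite warehouse with a hypothetical "staging_sales" source and "fact_sales" target; the schema and cleansing rules are illustrative only.

```python
# Minimal ETL sketch for a reporting table: extract from staging, apply a
# simple transformation, and load into a fact table. Schema is assumed.
import sqlite3


def run_etl(db_path: str = "warehouse.db") -> None:
    con = sqlite3.connect(db_path)
    try:
        # Extract: pull rows from the staging (source) table.
        rows = con.execute(
            "SELECT order_id, amount, order_date FROM staging_sales"
        ).fetchall()

        # Transform: drop rows with missing amounts, round to two decimals.
        cleaned = [
            (order_id, round(float(amount), 2), order_date)
            for order_id, amount, order_date in rows
            if amount is not None
        ]

        # Load: upsert into the reporting/fact table.
        con.execute(
            """CREATE TABLE IF NOT EXISTS fact_sales (
                   order_id TEXT PRIMARY KEY,
                   amount REAL,
                   order_date TEXT
               )"""
        )
        con.executemany(
            "INSERT OR REPLACE INTO fact_sales VALUES (?, ?, ?)", cleaned
        )
        con.commit()
    finally:
        con.close()
```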
Reference: CPT005206-RP-1. Data Engineer. Our client is seeking a skilled Big Data Architect to design and implement big data solutions. The role will involve developing data pipelines, ensuring data quality, and collaborating with data analysts to meet business requirements. Design and implement data pipelines for efficient data movement and processing (ETL). Ensure data quality through validation and cleaning techniques. Build and manage scalable data storage and processing systems using distributed frameworks like Apache Hadoop. Collaborate with data analysts to understand data requirements and translate them into technical solutions.
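The data-quality responsibility in this listing can be pictured with a small validation-and-cleaning step. The sketch below assumes a pandas DataFrame with hypothetical columns (customer_id, event_time, amount) and shows one way to split clean rows from rejects; it is not the client's actual rule set.

```python
# Sketch of a data-quality step in a pipeline: validate incoming records and
# separate clean rows from rejects. Column names and rules are assumed.
import pandas as pd


def validate_and_clean(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Return (clean, rejected) frames based on simple quality rules."""
    required = ["customer_id", "event_time", "amount"]
    # Rule 1: required fields must be present (non-null).
    has_required = df[required].notna().all(axis=1)
    # Rule 2: amounts must parse as non-negative numbers.
    amount_ok = pd.to_numeric(df["amount"], errors="coerce") >= 0
    mask = has_required & amount_ok

    clean = df[mask].copy()
    # Cleaning: trim whitespace on identifiers, parse timestamps.
    clean["customer_id"] = clean["customer_id"].astype(str).str.strip()
    clean["event_time"] = pd.to_datetime(clean["event_time"], errors="coerce")
    return clean, df[~mask].copy()
```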
Translate business requirements into Data/IT requirements. Data extraction using multiple languages (SQL). CDEs (Critical Data Elements). Source-to-target mapping (STTM), including knowledge of data dictionaries and taxonomies. Data transformation. Advanced data analysis using Excel (pivot tables) and the ability to use related tools. Data remediation and associated activities: data remediation planning and tracking; data quality dashboards (design and implementation). Data governance and compliance. Knowledge of project assurance and
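The source-to-target mapping (STTM) skill mentioned above can be illustrated with a short sketch: a mapping table drives the extraction SQL so that renames stay consistent with the data dictionary. The mapping, column names, and source table here are hypothetical examples, not taken from the listing.

```python
# Sketch of applying a source-to-target mapping (STTM) during extraction.
# The mapping and table names are illustrative assumptions.
import sqlite3

# STTM: source column -> target (reporting) column
STTM = {
    "cust_no": "customer_id",
    "txn_amt": "transaction_amount",
    "txn_dt": "transaction_date",
}


def extract_with_sttm(con: sqlite3.Connection, source_table: str = "src_transactions"):
    """Build the SELECT from the mapping so extraction and the STTM stay in sync."""
    select_list = ", ".join(f"{src} AS {tgt}" for src, tgt in STTM.items())
    cur = con.execute(f"SELECT {select_list} FROM {source_table}")
    columns = [d[0] for d in cur.description]
    return [dict(zip(columns, row)) for row in cur.fetchall()]
```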