An innovative financial services firm has an opportunity for a Data Engineer who will take responsibility for driving ETL systems for a big data warehouse, delivering robust and trustworthy data to support high-performing algorithms, predictive models, and real-time data visualisation requirements across the organisation.
Responsibilities:
- Systematic solution design of the ETL and data pipeline, in line with business user specifications and the approved solution design
- Ensure data governance and data quality assurance standards are upheld
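The ETL and data-quality duties described above can be illustrated with a minimal sketch: extract rows from a source, apply a transformation that also enforces a simple quality rule, and load the result into a warehouse table. The file layout, table name, and quality rule below are illustrative assumptions, not part of the job specification.

```python
# Minimal ETL sketch: extract rows from CSV text, transform them
# (including a simple data quality check), and load them into a
# warehouse table. All names here are illustrative only.
import csv
import sqlite3
from io import StringIO

def extract(csv_text):
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(StringIO(csv_text)))

def transform(rows):
    """Transform: normalise amounts to floats and drop incomplete rows
    (a simple stand-in for data quality assurance)."""
    clean = []
    for row in rows:
        if row.get("customer_id") and row.get("amount"):
            clean.append({"customer_id": row["customer_id"],
                          "amount": float(row["amount"])})
    return clean

def load(rows, conn):
    """Load: write the cleaned rows into a warehouse fact table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_sales (customer_id TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO fact_sales VALUES (:customer_id, :amount)", rows)
    conn.commit()

raw = "customer_id,amount\nC1,100.50\nC2,\nC3,42.00\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_sales").fetchone()
```

In a production pipeline the same extract/transform/load stages would typically be orchestrated by a scheduler and target a real warehouse rather than an in-memory database.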
Reference Number for this position is BRM59236. This is a remote position offering a contract rate of up to R800
and IT Methodology processes is recruiting for a Data Scientist – AI Platform to offer a deep insight
enterprises as well as dynamic start-ups is looking for a Data Engineer to join an environment of like-minded engineers.
Responsibilities:
- Build data warehouse components on the Teradata warehouse
- Present data on dashboards and visualisations in Power BI
- Comply with all data governance requirements, such as documenting data definitions and business glossaries, while delivering a client project at a financial
- Build data systems and pipelines
- Evaluate business needs and patterns
- Conduct complex data analysis and report on results
- Prepare data for prescriptive and predictive
SAP, JAVA, Azure, Cloud is looking for an Azure Data Engineer. The unique positioning gives a deep insight into Methodology processes within the group and offers internal clients an extended warranty due to their vested interest in maintaining Big Data Pipelines using Data Platforms. As custodians of data, they must ensure that data is shared on a need-to-know basis.
Requirements:
- Experience using programming skills in data-related programming languages and frameworks, such as Kusto
- Experience with Azure Data Solutions: Azure Data Factory, Azure Data Explorer, Azure Databricks
presence with regional and international expertise has a vacant position for an AWS Data Engineer to work within
- Expertise in data modelling and Oracle SQL
- Exceptional analytical skills analysing large and complex data sets
- Perform testing and data validation to ensure the accuracy of data transformations
- Build data pipelines using AWS Glue or Data Pipeline, or similar platforms
- Prepare specifications from which programs will be developed, plus technical documentation and artefacts
- Work with data quality tools such as Great Expectations
- Developing
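The testing and data-validation duty above can be sketched in plain Python using expectation-style checks, mimicking the idea behind tools such as Great Expectations (each rule is an "expectation" run against the dataset, and failures are collected). The column names and thresholds are illustrative assumptions, not part of the role description.

```python
# Expectation-style data validation sketch. Each check returns the
# indices of rows that violate the rule; validate() collects the
# failing checks. Column names and bounds are illustrative only.

def expect_no_nulls(rows, column):
    """Expect every row to have a non-empty value in `column`."""
    bad = [i for i, r in enumerate(rows) if not r.get(column)]
    return ("expect_no_nulls", column, bad)

def expect_between(rows, column, lo, hi):
    """Expect numeric values in `column` to fall within [lo, hi]."""
    bad = [i for i, r in enumerate(rows)
           if r.get(column) and not lo <= float(r[column]) <= hi]
    return ("expect_between", column, bad)

def validate(rows, expectations):
    """Run all expectations; return only those with non-empty failures."""
    results = [check(rows, *args) for check, *args in expectations]
    return [r for r in results if r[2]]

data = [
    {"order_id": "A1", "quantity": "3"},
    {"order_id": "",   "quantity": "5"},
    {"order_id": "A3", "quantity": "999"},
]
failures = validate(data, [
    (expect_no_nulls, "order_id"),
    (expect_between, "quantity", 1, 100),
])
```

Running such checks after each transformation step, and failing the pipeline when expectations break, is the usual way tools of this kind guard the accuracy of data transformations.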
Requirements: Sound knowledge of Python and Java; Data Science concepts and principles; experience in tools and applications; architecture and interface design; data modeling and database technologies (relational,
will have the chance to showcase your skills in a data warehouse environment and have a direct impact on the organisation's bottom line through analysis and strategic thinking.
Responsibilities:
- Plan and multi-discipline business opportunities
- Conduct planning, analysis and design activities in conjunction with other development specialists
- Participate in analysis of complex business opportunities/problems to deliver
- Developing end-to-end data acquisition processes to be used in population of data warehouse/data marts and/or
An exciting opportunity to join our international manufacturing client with head offices based in the scalable, catering for the requirements of various international markets, with functionality encapsulated in APIs. External support; troubleshooting deployments; debugging remote services; ability to read, interpret and follow
and Back-End Development, currently offering this remote work opportunity. You will be responsible for developing performant applications and implementing security and data protection. Reference Number for this position is