and IT Methodology processes is recruiting for a Data Scientist – AI Platform to offer a deep insight
innovative financial services have an opportunity for a Data Engineer who will take responsibility for driving ETL systems for a big data warehouse, delivering robust and trustworthy data to support high-performing algorithms, predictive models and real-time data visualisation requirements across the organisation.
Responsibilities:
- Systematic solution design of the ETL and data pipeline in line with business user specifications and the approved solution design
- Ensure data governance and data quality assurance standards are upheld
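Roles like this one revolve around ETL (extract, transform, load) with data-quality guarantees. As a hedged illustration only, not part of the listing, a minimal ETL step might look like the following; the field names, cleaning rules and sample rows are invented for the sketch:

```python
# Minimal ETL sketch: extract raw records, transform (cast and validate types),
# and load into a list standing in for a warehouse table.
# All data and field names are illustrative, not from any real system.

raw_rows = [
    {"id": "1", "amount": "100.50", "currency": "ZAR"},
    {"id": "2", "amount": "bad",    "currency": "ZAR"},   # will be rejected
    {"id": "3", "amount": "7.25",   "currency": "USD"},
]

def transform(row):
    """Cast fields to proper types; return None for rows that fail."""
    try:
        return {"id": int(row["id"]),
                "amount": float(row["amount"]),
                "currency": row["currency"]}
    except (KeyError, ValueError):
        return None

warehouse_table = []   # stands in for the load target
rejects = []           # quarantined rows for data-quality review

for row in raw_rows:
    clean = transform(row)
    (warehouse_table if clean else rejects).append(clean or row)

print(len(warehouse_table), len(rejects))  # 2 clean rows, 1 reject
```

Quarantining failed rows rather than dropping them silently is what makes the "robust and trustworthy data" requirement auditable.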
enterprises as well as dynamic start-ups is looking for a Data Engineer to join an environment of like-minded engineers.
Responsibilities:
- Build data warehouse components on a Teradata warehouse
- Present data on dashboards and visualisations in Power BI
- Comply with all data governance requirements, such as documenting data definitions and business glossaries
- Delivering a client project at a financial
- Build data systems and pipelines
- Evaluate business needs and patterns
- Conduct complex data analysis and report on results
- Prepare data for prescriptive and predictive modelling
SAP, JAVA, Azure, Cloud is looking for an Azure Data Engineer. The unique positioning gives a deep insight into maintaining Big Data pipelines using data platforms.
- Custodians of data must ensure that data is shared on a need-to-know basis
- Experience using programming skills in data-related programming languages and frameworks, such as Kusto
- Experience with Azure Data Solutions: Azure Data Factory, Azure Data Explorer, Azure Databricks
- Profound technical understanding of Data Engineering and Data Warehouse Design
- Familiar with modern
international expertise has a vacant position for an AWS Data Engineer to work within a team solving complex problems.
- Expertise in data modelling and Oracle SQL
- Exceptional analytical skills for analysing large and complex data sets
- Perform testing and data validation to ensure the accuracy of data transformations
- Building data pipelines using AWS Glue, AWS Data Pipeline, or similar platforms
- Preparing specifications from which programs will be developed, as well as technical documentation and artefacts
- Working with data quality tools such as Great Expectations
- Developing
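Tools such as Great Expectations express data-quality rules as declarative "expectations" checked against each batch of data. The following is a rough, stdlib-only sketch of that idea; the column names and rules are made up for illustration, and this is not the Great Expectations API itself:

```python
# Sketch of declarative data-quality checks in the spirit of tools like
# Great Expectations. Each expectation is a (description, predicate) pair
# evaluated over every row in a batch. Data and rules are invented.

batch = [
    {"account_id": 101, "balance": 2500.0},
    {"account_id": 102, "balance": -40.0},   # violates the second rule
    {"account_id": 103, "balance": 0.0},
]

expectations = [
    ("account_id is not null",
     lambda r: r.get("account_id") is not None),
    ("balance is non-negative",
     lambda r: r.get("balance", 0) >= 0),
]

def validate(rows, checks):
    """Return a {expectation: failure_count} report for a batch of rows."""
    return {name: sum(0 if pred(r) else 1 for r in rows)
            for name, pred in checks}

results = validate(batch, expectations)
print(results)  # {'account_id is not null': 0, 'balance is non-negative': 1}
```

Keeping the rules as data (rather than ad-hoc if-statements) is what lets such tools report, version and document expectations alongside the pipeline.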
Requirements:
- Sound knowledge of Python and Java
- Data Science concepts and principles
- Experience in tools and applications
- Architecture and interface design
- Data modeling and database technologies (relational,
will have the chance to showcase your skills in a data warehouse environment and have a direct impact on developing end-to-end data acquisition processes used to populate data warehouses/data marts and/or support the data acquisition development process. Design, develop and execute complex data acquisition processes.
- B.Sc. or related degree is advantageous
- Relevant data warehouse and BI solution training is preferred
- Database and data warehouse modeling skills, in order to understand the data warehouse data models
- Terra
views and triggers for efficient data processing, and generate financial data required by the revenue services financial authority.
Responsibilities:
- Manage end-to-end data solutions through the Microsoft BI Stack
- Create and optimise advanced SQL queries for efficient data retrieval, manipulation and reporting
- Liaise with
- Certifications in Business Intelligence and/or Data Engineering advantageous
- Experience with the Microsoft BI stack: SSIS (ETL Development), SSAS (Data Cubes, Mining), SSRS (Report Development) & Power BI (Data Visualisations)
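The views and triggers mentioned in this listing can be illustrated with a small, self-contained SQLite example; the table, view and column names below are invented for the sketch and do not come from the listing:

```python
import sqlite3

# In-memory database standing in for a reporting database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (
        id     INTEGER PRIMARY KEY,
        amount REAL NOT NULL
    );
    CREATE TABLE audit_log (txn_id INTEGER, note TEXT);

    -- View: pre-aggregated figures for reporting queries.
    CREATE VIEW revenue_summary AS
        SELECT COUNT(*) AS txn_count, SUM(amount) AS total
        FROM transactions;

    -- Trigger: write an audit row whenever a transaction is inserted.
    CREATE TRIGGER log_insert AFTER INSERT ON transactions
    BEGIN
        INSERT INTO audit_log VALUES (NEW.id, 'inserted');
    END;
""")

conn.execute("INSERT INTO transactions (amount) VALUES (100.0)")
conn.execute("INSERT INTO transactions (amount) VALUES (250.5)")

print(conn.execute("SELECT * FROM revenue_summary").fetchone())   # (2, 350.5)
print(conn.execute("SELECT COUNT(*) FROM audit_log").fetchone())  # (2,)
```

The view keeps aggregation logic in one place for reporting, while the trigger produces the audit trail that financial-authority reporting typically demands.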
performant applications. Implementation of security and data protection. The Reference Number for this position is
equivalent. 5 years of experience in the DevOps space. Big Data platforms (Vanilla Hadoop, Cloudera or Hortonworks).