ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the applicable data policies. A degree in Computer Science, Data Engineering or a comparable field of study with a focus on data-intensive applications. Experience in architecting and implementing scalable data pipelines in cloud environments, preferably Azure. ESSENTIAL SKILLS: Programming skills in data-related programming languages and frameworks, such as
A business-led data and business intelligence consultancy that empowers organisations with data-driven decision-making environments. Our expert teams apply a mastery of data and technology to craft strategies that revolutionise how organisations work. As a Data Engineer you will utilise your knowledge of Integrations, Automations, Data Management and Data Modeling. You will collaborate with the Business Intelligence team, Product Development teams, Cloud Operations teams, and the Data Engineering team to curate a high-quality dataset to serve your stakeholders.
Our client requires the services of a Data Engineer/Scientist (Senior) – Midrand/Menlyn/Rosslyn/Home. Skills include Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, and PowerShell/Bash. Experience in working with Enterprise. Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV etc. Experience working with Data Quality Tools.
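The listing above asks for familiarity with data formats such as Parquet, AVRO, JSON, XML and CSV. As an illustrative sketch only (standard library, hypothetical sample data, not part of the listing), converting between two of these formats in Python might look like:

```python
import csv
import io
import json

def csv_to_json(csv_text: str) -> str:
    """Convert CSV text with a header row into a JSON array of objects."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows, indent=2)

# Hypothetical sample input for illustration.
sample = "id,name\n1,Alice\n2,Bob\n"
print(csv_to_json(sample))
```

Binary columnar formats such as Parquet or AVRO would need third-party libraries (e.g. pyarrow, fastavro), but the reshaping step is conceptually the same.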
Our client requires the services of a Data Engineer/Scientist (Expert) – Midrand/Menlyn/Rosslyn/Home. Skills include Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH), and the GROUP CDEC Blueprint. Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV. Experience working with Data Quality Tools.
Our client requires the services of a Data Engineer/Scientist (Senior) – Midrand/Menlyn/Rosslyn/Home. Skills include Linux/Unix, Big Data, and PowerShell/Bash. ADVANTAGEOUS TECHNICAL SKILLS: Demonstrate expertise in modelling complex data sets. Perform thorough testing and data validation to ensure the accuracy of data transformations.
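The role above calls for testing and data validation of transformations. A minimal sketch of what such a check can look like (hypothetical quality rules, Python standard library only; real pipelines would use a dedicated Data Quality tool):

```python
def validate_rows(rows):
    """Return error messages for rows failing two example quality rules:
    a non-empty 'id' and a numeric, non-negative 'amount'."""
    errors = []
    for i, row in enumerate(rows):
        if not row.get("id"):
            errors.append(f"row {i}: missing id")
        try:
            if float(row.get("amount", "")) < 0:
                errors.append(f"row {i}: negative amount")
        except ValueError:
            errors.append(f"row {i}: non-numeric amount")
    return errors

good = [{"id": "1", "amount": "10.5"}]
bad = [{"id": "", "amount": "x"}]
print(validate_rows(good))  # []
print(validate_rows(bad))
```

Running such checks before and after each transformation step is one common way to catch silently corrupted records early.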
Build intuitive reporting solutions that visualise the data gathered from our cloud database footprint analysis, using standard platforms or frameworks, to represent data on cloud database usage, costs, health, and security. Work with the Cloud Database Analyst team to understand the data and insights that need to be communicated to internal stakeholders. Create interfaces and visualisations that simplify complex data sets and allow users to explore and interact with the data. Implement best practices in application development and data visualisation to ensure accessibility and usability.
SKILLS: SAP BW / SAP SAC. Data Modelling and data engineering skills. SAP BW 7.5 Data Modelling and BEx skills. SAP BW/4HANA Data Modelling skills. SAP BW/4HANA Query Modelling skills. Any additional related skills are beneficial. SAP BW-IP knowledge is beneficial. SAP Data Intelligence skills are beneficial. Modules - SAP
Ability to develop in data-driven programming languages such as Python and Big Data pipelines such as ETL. Software development experience. ESSENTIAL SKILLS: Expertise in Data Intelligence and Business Intelligence. At least 3 years' experience building Big Data pipelines (ETL, SQL, etc.). ADVANTAGEOUS TECHNICAL SKILLS
Ability to develop in data-driven programming languages such as Python and Big Data pipelines such as ETL. Agile Working Model (AWM) Charter. ADVANTAGEOUS SKILLS: Data and API mining. Knowledge of security best practices and setting up alerting pipelines. Comfortable with data structures and algorithms. Understanding of integration. Docker container creation and usage. Familiarity with data streaming services such as Apache Kafka. Coordination. Knowledge of Jira, Confluence and Agile methodologies. Data analysis. ITSM knowledge. User support ticket management.