Our client requires the services of a Data Engineer/Scientist (Expert) – Midrand/Menlyn/Rosslyn/Home Office. ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the applicable guidelines. Skills: Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint. Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV. Experience working with Data Quality Tools.
Our client requires the services of a Data Scientist/Engineer (Entry) – Midrand/Menlyn/Rosslyn/Home Office. ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the applicable guidelines. Skills: Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, GROUP Cloud Data Hub (CDH), GROUP CDEC Blueprint. Business Intelligence (BI) experience. Technical data modelling and schema design ("not drag and drop").
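The ETL and data-format skills the two listings above call for (CSV, JSON, data quality checks) can be sketched in plain Python. This is an illustrative sketch only: the column name `amount` and the cast-and-skip quality rule are assumptions for the example, not part of the role description.

```python
import csv
import io
import json

def csv_to_json_records(csv_text):
    """Minimal ETL step: extract rows from CSV, cast a numeric field,
    drop rows that fail the cast (a simple data-quality check), and
    emit the result as JSON."""
    reader = csv.DictReader(io.StringIO(csv_text))
    records = []
    for row in reader:
        # Transform: cast the (assumed) "amount" column; skip bad rows.
        try:
            row["amount"] = float(row["amount"])
        except (KeyError, ValueError):
            continue
        records.append(row)
    return json.dumps(records)

raw = "id,amount\n1,10.5\n2,not_a_number\n3,7\n"
print(csv_to_json_records(raw))
```

A real pipeline on the platforms named above would swap the in-memory strings for S3 objects (via Boto3) and columnar formats such as Parquet, but the extract/transform/load shape stays the same.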
Reference: JHB001787-GuguN-1. Job Description: The Data Scientist will focus on developing cost variance use cases as part of Exxaro Resources' Data & AI Strategy. This position entails working onsite, applying statistical and machine learning techniques to drive data-driven decision making, and presenting findings to both technical and non-technical stakeholders. What We're Looking For: Experience: 4–6 years in data science roles. Skills: Proficiency in Python, data analysis and modelling techniques. Experience with data visualization tools (e.g., Tableau, Power BI).
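As a rough sketch of the cost-variance idea behind the use case above: cost variance is commonly computed as budgeted minus actual cost, positive meaning under budget. The listing does not define the formula, so the one below is an assumption for illustration.

```python
def cost_variance(budgeted, actual):
    """Cost variance as commonly defined in cost control:
    positive = under budget, negative = overrun.
    (Illustrative assumption -- the listing does not specify a formula.)"""
    return budgeted - actual

def variance_pct(budgeted, actual):
    # Variance expressed as a percentage of budget; guard against zero budgets.
    if budgeted == 0:
        raise ValueError("budgeted amount must be non-zero")
    return (budgeted - actual) / budgeted * 100

print(cost_variance(1000.0, 1150.0))           # -150.0 -> overrun of 150
print(round(variance_pct(1000.0, 1150.0), 1))  # -15.0
```

In practice the role would layer statistical and ML models on top of calculations like this, e.g. to flag variances that are anomalous rather than merely nonzero.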
Reference: JHB001726-KK-1. Join our IT Hub South Africa as a Data Scientist – AI Platform, focusing on innovative AI solutions.
Requirements: Sound knowledge of Python and Java. Data Science concepts and principles. Experience with relevant tools and applications. Architecture and interface design. Data modelling and database technologies (relational, etc.).
Join us as an AWS Data Engineer (Expert) and leverage your skills in building and maintaining Big Data Pipelines, using your expertise in Python, SQL, and AWS services to drive our data initiatives forward. Be part of a collaborative team where your contributions ensure accurate and secure data management. ESSENTIAL SKILLS REQUIREMENTS: ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, Cloud Data Hub (CDH), CDEC Blueprint. ADVANTAGEOUS SKILLS REQUIREMENTS: Expertise in data modelling with Oracle SQL; exceptional analytical skills.
Join Our Funky Data Adventure! Are you ready to dive into the groovy world of data management? Bring your QlikSense mastery for visualizations that make data dance. Get ready to rock 'n' roll with: collaborating across teams and fine-tuning existing processes like a backstage hero, ensuring data harmony across all departments; empowering users across purchasing and supplier networks, becoming the MVP of data insights. You'll be at the forefront of maintaining data quality. If you're ready to groove with the coolest cats in data management, hit us up with your CV and let's jam!
Get ready to dive into the dynamic world of data-driven solutions, where every insight you uncover makes an impact. Skills: SAP BW / SAP SAC. Data modelling and data engineering skills. SAP BW 7.5 data modelling and BEx query skills. SAP BW/4HANA data modelling skills. SAP BW/4HANA query modelling skills. Any additional responsibilities as assigned.
This role involves building integrations that connect various software applications and data sources, collaborating with cross-functional teams to document processes, and maintaining compliance with data security regulations. Requirements: Knowledge of APIs, web services (SOAP, REST), and data formats (XML/JSON). Experience with programming and cloud platforms (AWS, Azure, Google Cloud). Knowledge of data integration and ETL tools. Familiarity with Agile methodologies.
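The REST/JSON glue work described above can be sketched with the Python standard library. The endpoint URL, the `order_id`/`status` payload fields, and the bearer token are all hypothetical, chosen only to show the shape of an authenticated JSON POST.

```python
import json
import urllib.request

API_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint

def build_request(payload, token):
    """Serialize a dict to JSON and wrap it in an authenticated POST --
    the kind of glue an integration developer writes around REST APIs."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # dummy credential for the sketch
        },
        method="POST",
    )

req = build_request({"order_id": 42, "status": "shipped"}, "dummy-token")
print(req.get_method(), req.get_header("Content-type"))
```

Building the request separately from sending it (here the request is never actually dispatched) keeps the serialization and authentication logic testable without network access, which matters when integrations must be documented and audited for compliance.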