A company based in Pretoria is looking for a Software Engineer - Conversational AI to join their team on contract as soon as possible. Requirements: a relevant IT / Business / Engineering degree; one or more of the following certifications: AWS Certified SysOps Administrator Associate, AWS Certified Developer Associate, AWS Certified Solutions Architect Associate, AWS Certified Solutions Architect Professional, HashiCorp Certified Terraform Associate; a clean ITC/credit record and no criminal record.
Our client is growing their data space and seeking to appoint an innovative, adept Data Engineer to join their team of Data Experts. As a Data Engineer, you will be responsible for developing best practices and optimising current workloads, data warehouse builds, data pipelines, and data products. Skills: Azure, Azure Synapse, Azure Data Lake, Terraform, ETL, data warehousing, data modelling. Are you qualified? Degree (Computer Science, Engineering, or similar); 3 years in a Data Engineering role or a related field.
Our client is looking for a highly analytical person with a knack for data analysis, involving the manipulation and modelling of, and flow between, various data sources and business processes. The Data Engineer will also be responsible for designing and building data management solutions, with elements of integrating with data science tools to visualise data. Key Responsibilities & Accountabilities: construct end-to-end data service solutions; understand and manage the client's data requirements, the data being specific to the financial markets.
Our client requires the services of a Data Engineer/Scientist (Senior) - Midrand/Menlyn/Rosslyn/Home - to start as soon as possible. ROLE: Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with policy. Citizens/residents are preferred. Requirements: a relevant IT / Business / Engineering degree; candidates with one or more of the following certifications: AWS Certified SysOps Administrator Associate, AWS Certified Developer Associate, AWS Certified Solutions Architect Associate, AWS Certified Solutions Architect Professional.
Development: creation of services, reports, and applications as per business requirements and agreed deadlines. Support: responsible for assisting IT and IS support personnel in solving technical problems. Documentation: comprehensive documentation of new and existing systems at a technical level.
A client has a requirement for an intermediate DevOps Data Engineer.
Duties:
A client with an Azure and cloud landscape seeks a Data Streaming Platform Engineer to run business-critical services.
Job Title: AWS Data Engineers required - project-based contract, to start ASAP - JHB - hybrid. Duties: design, develop, and implement AWS cloud solutions for data engineering projects; utilise AWS services such as API Gateway to build scalable and efficient data pipelines; collaborate with cross-functional teams. Qualifications: proven experience as an AWS Data Engineer or in a similar role; a strong understanding of AWS and of identifying data/job dependencies; knowledge of / experience with data engineering on the AWS platform.
Requirements: a solid 5 years of experience with Azure Data Factory, Databricks, and Power BI; a degree in Computer Science; experience with data manipulation and notebook-based analytics; a thorough understanding of cloud data security practices; experience in optimising cloud-based data platforms for high performance and reliability, especially those related to data engineering. Responsibilities: leverage Azure Data Factory to orchestrate and automate data workflows, ensuring seamless ETL processes; utilise Databricks to perform complex data analytics.