Drive operational excellence, including automation and proactive monitoring approaches. Collaborate with an international team. System hardening (e.g., Ivanti). Worked with Application Monitoring tools. ESSENTIAL SKILLS: Experience in Azure Monitoring Tools. Experience in VMware Horizon Cloud (preferable). Troubleshooting and patch management. Automated problem remediation. Machine learning (ML) for data analytics. Network troubleshooting skills.
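As a minimal, hedged sketch of the kind of proactive Azure monitoring automation this listing refers to, the snippet below runs a Kusto query against a Log Analytics workspace using the azure-monitor-query SDK. The workspace ID and the Heartbeat query are illustrative assumptions, not details from the listing.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

# Authenticate with whatever identity is available in the environment.
client = LogsQueryClient(DefaultAzureCredential())

# Hypothetical check: which machines reported heartbeats in the last hour?
response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",  # placeholder
    query="Heartbeat | summarize count() by Computer | top 5 by count_",
    timespan=timedelta(hours=1),
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(row)
```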
Our client requires the services of a Data Scientist/Engineer (Entry) – Midrand/Menlyn/Rosslyn/Home Office
ROLE: Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. Data Engineers are custodians of data and must ensure that data is shared in line with the applicable governance requirements. ESSENTIAL SKILLS: Boto3. ETL. Docker. Linux/Unix. Big Data. PowerShell/Bash. GROUP Cloud Data Hub (CDH). GROUP CDEC Blueprint. Business Intelligence (BI) experience. Technical data modelling and schema design ("not drag and drop").
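The GROUP Cloud Data Hub and CDEC Blueprint are proprietary, so as a generic illustration of the Boto3/ETL skills named above, here is a minimal extract-transform-load step over S3. The bucket names, keys and record schema are hypothetical.

```python
import json

import boto3  # AWS SDK for Python

s3 = boto3.client("s3")

def extract(bucket: str, key: str) -> list[dict]:
    """Pull raw JSON-lines records from an S3 landing zone."""
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    return [json.loads(line) for line in body.decode().splitlines() if line]

def transform(records: list[dict]) -> list[dict]:
    """Keep only valid rows and normalise the fields we care about."""
    return [
        {"id": r["id"], "amount": float(r["amount"])}
        for r in records
        if "id" in r and "amount" in r
    ]

def load(bucket: str, key: str, records: list[dict]) -> None:
    """Write the curated records to a hypothetical 'hub' layer bucket."""
    payload = "\n".join(json.dumps(r) for r in records)
    s3.put_object(Bucket=bucket, Key=key, Body=payload.encode())

if __name__ == "__main__":
    raw = extract("raw-zone-bucket", "sales/2024/01/records.jsonl")
    load("curated-zone-bucket", "sales/2024/01/records.jsonl", transform(raw))
```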
Our client requires the services of a Data Scientist. POSITION: Contract role for 3 months. REQUIREMENTS: 4–6 years' experience. Cost variance use case build. Data Science.
Responsibilities: Leverage Azure Data Factory to orchestrate and automate data workflows, ensuring seamless data movement. Perform complex data analytics, employing both Azure and Python notebooks for scalable data processing. Power BI: Transform raw data into compelling visual stories with Power BI, providing actionable insights. Python programmer: Use Python within Azure to manage data, automate tasks, and build machine learning models. Protect data within Azure, using tools and practices that prevent unauthorized access. Monitor and control access to data.
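For the Azure Data Factory orchestration responsibility, a hedged sketch of starting and polling a pipeline run from Python with the azure-mgmt-datafactory client. The subscription, resource group, factory and pipeline names are placeholders, not values from the listing.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder subscription; credentials come from the ambient environment.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Kick off a (hypothetical) pipeline with a runtime parameter.
run = client.pipelines.create_run(
    resource_group_name="rg-data-platform",
    factory_name="adf-analytics",
    pipeline_name="copy_sales_data",
    parameters={"window_start": "2024-01-01"},
)
print("Started ADF pipeline run:", run.run_id)

# Poll the run once; a real orchestrator would loop with a backoff.
status = client.pipeline_runs.get("rg-data-platform", "adf-analytics", run.run_id)
print("Current status:", status.status)
```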
Ability to develop in data-driven programming languages such as Python and to build Big Data pipelines such as ETL. Software development experience. ESSENTIAL SKILLS: Expertise in Data Intelligence and Business Intelligence. At least 3 years' experience building big data pipelines (ETL, SQL, etc.). ADVANTAGEOUS TECHNICAL SKILLS: Assisting with the business case. Planning and monitoring. Eliciting requirements. Requirements organisation.
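As a toy illustration of the ETL/SQL pipeline work the experience requirement refers to, a self-contained transform-and-load step using only the standard-library sqlite3 module. The table names and sample data are invented.

```python
import sqlite3

# In-memory database standing in for a warehouse staging area.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (id INTEGER, amount TEXT, status TEXT);
    INSERT INTO raw_orders VALUES
        (1, '100.50', 'OK'), (2, 'bad', 'OK'), (3, '75.00', 'CANCELLED');
    CREATE TABLE clean_orders (id INTEGER PRIMARY KEY, amount REAL);
""")

# Transform-and-load expressed in SQL: keep valid, non-cancelled rows only.
conn.execute("""
    INSERT INTO clean_orders (id, amount)
    SELECT id, CAST(amount AS REAL)
    FROM raw_orders
    WHERE status = 'OK' AND amount GLOB '[0-9]*'
""")
conn.commit()

print(conn.execute("SELECT * FROM clean_orders").fetchall())  # [(1, 100.5)]
```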
Build a database access monitoring solution for PostgreSQL databases. In-depth knowledge of logging and monitoring. Written and oral communication and presentation skills. Knowledge of data modelling and data visualisation tools. Cloud experience: Familiarity with cloud platforms like Amazon Web Services.
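A minimal sketch of where such a PostgreSQL access-monitoring solution might start: polling the built-in pg_stat_activity view with psycopg2. The connection settings and database name are placeholders.

```python
import psycopg2  # PostgreSQL driver

# Placeholder connection settings; real ones would come from config/secrets.
conn = psycopg2.connect(host="localhost", dbname="appdb",
                        user="monitor", password="...")

with conn.cursor() as cur:
    # pg_stat_activity is PostgreSQL's built-in view of current sessions,
    # a natural starting point for database access monitoring.
    cur.execute("""
        SELECT usename, client_addr, state, query_start
        FROM pg_stat_activity
        WHERE datname = %s
    """, ("appdb",))
    for user, addr, state, started in cur.fetchall():
        print(f"{user}@{addr}: {state} since {started}")

conn.close()
```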
Ability to develop in data-driven programming languages such as Python and to build Big Data pipelines such as ETL. ADVANTAGEOUS SKILLS: Data and API mining. Knowledge of security best practices. Advanced monitoring of systems and setting up alerting pipelines. Comfortable with data structures and algorithms. Understanding of integration. Docker container creation and usage. Familiarity with data streaming services such as Apache Kafka. Coordination. Knowledge of Jira, Confluence and Agile methodologies. Data analysis. ITSM knowledge. User support ticket management.
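As a small, hedged example of the Kafka streaming plus alerting-pipeline skills listed above, a consumer built with the kafka-python package. The broker address, topic name and the error heuristic are all assumptions for illustration.

```python
from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "app-events",                          # hypothetical topic
    bootstrap_servers=["localhost:9092"],  # placeholder broker
    group_id="monitoring-consumers",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: b.decode("utf-8"),
)

# Simple alerting pipeline: flag any event whose payload mentions an error.
for message in consumer:
    if "error" in message.value.lower():
        print(f"ALERT partition={message.partition} "
              f"offset={message.offset}: {message.value}")
```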
Ensure timely order fulfilment and delivery. Track and monitor order status to ensure on-time delivery and customer satisfaction. Maintain consistent messaging and branding. Data Management: Compile and analyze sales data, prepare sales reports, and communicate findings effectively. Strong attention to detail and accuracy in data entry and document preparation. Ability to work independently.