Duties:
- Determine and define project scope and objectives
- Predict resources needed to reach objectives and manage resources effectively and efficiently
- Prepare a budget based on the scope of work and resource requirements
- Track project costs in order to meet the budget
- Develop and manage a detailed project schedule
Network Specialist. You will be based at Plettenberg Bay Municipality, where you will be expected to ensure … Apply at datafin.com/job/desktop-support-engineer-plettenberg-bay/ OR e-mail a Word copy of your CV to chantelle@datafin.com. Desktop Support Engineer, Plettenberg Bay. Salary: Negotiable.
Description: Our client is seeking a highly skilled and experienced Security and Investigations Officer to join their team, based in East London. The successful candidate is required to have a strong background in both security and investigations, with a proven track record of identifying and mitigating potential risks.
department to help develop the strategy for long-term Big Data platform architecture. Document and effectively …
- Engineering, High Performance Computing, Data Warehousing, Big Data Processing
- Strong experience working with Kubernetes, Hadoop, Kafka, NiFi or Spark, or cloud-based big data processing environments such as Amazon Redshift, BigQuery and Azure Synapse Analytics
- Experience with Big Data technologies such as Hadoop, Spark and Hive
- Julia, T-SQL, PowerShell
- Experience working with cloud-based Big Data technologies (AWS, Azure etc.)
This role is based in South Africa.
deploying and supporting: F5 BIG-IP LTM, F5 BIG-IP DNS, F5 BIG-IP ASM, F5 BIG-IP APM, Infoblox DDI and CNS.
Responsibilities:
- Design, implement, and optimize Big Data Pipelines using AWS services.
- Ensure data integrity.
- Experience with Docker, Linux/Unix, ETL, and Big Data technologies.
- Excellent communication skills.
- Build pipelines from ingestion to consumption within a big data architecture, using Java, PySpark, Scala and Kafka, drawing from a wide variety of data sources using SQL, AWS big data technologies and Kafka CC.