Kafka Engineer Job Profile
A Kafka engineer is a big data engineer who specializes in developing and managing Kafka-based data pipelines. You will also be required to work with other big data technologies such as Hadoop, Spark, and Storm.
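To give a feel for the day-to-day work, here is a minimal sketch of the kind of producer code a Kafka engineer writes, in Java using the standard kafka-clients library. The broker address (localhost:9092), topic name ("events"), and record contents are illustrative assumptions, not details from this listing.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class EventProducer {
    public static void main(String[] args) {
        // Assumed setup: a broker reachable at localhost:9092 and a topic named "events".
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // try-with-resources closes the producer and releases its network resources.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish one record; downstream consumers (for example Spark or Storm jobs)
            // read the same topic to continue the pipeline.
            producer.send(new ProducerRecord<>("events", "order-123", "{\"amount\": 42}"));
            producer.flush();
        }
    }
}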
Description:
Our client is a big player pioneering business models in foreign exchange. If you are ready to make your next big career move, please apply directly.
Job Experience:
Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (Hadoop, Spark).
10 years of experience managing complex projects.
Requirements:
National