department to help develop the strategy for the long-term Big Data platform architecture.
- High Performance Computing, Data Warehousing, Big Data Processing
- Strong experience working with Kubernetes, Hadoop, Kafka, NiFi, or Spark, or cloud-based big data processing environments such as Amazon Redshift, BigQuery, and Azure Synapse Analytics
- Experience with Big Data technologies such as Hadoop, Spark, and Hive
- Julia, T-SQL, PowerShell
- Experience working with cloud-based Big Data technologies (AWS, Azure, etc.) is advantageous
South Africa.

Responsibilities:
- Design, implement, and optimize Big Data pipelines using AWS services.
- Ensure data integrity across ETL processes.
- Experience with Docker, Linux/Unix, and Big Data technologies.
- Excellent communication and problem-solving skills.
Build pipelines from ingestion to consumption within a big data architecture, using Java, PySpark, Scala, and Kafka; integrate a wide variety of data sources using SQL, AWS big data technologies, and Kafka CC.
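The ingestion-to-consumption flow described above can be sketched in miniature. This is only an illustrative toy, using Python's stdlib sqlite3 as a stand-in for the warehouse; in the role described it would be PySpark/Kafka against AWS services, and all table and function names here are hypothetical.

```python
import sqlite3

def ingest(conn, records):
    # Ingestion stage: land raw events in a staging table.
    conn.execute("CREATE TABLE IF NOT EXISTS staging (source TEXT, amount REAL)")
    conn.executemany("INSERT INTO staging VALUES (?, ?)", records)

def transform(conn):
    # Transformation stage: aggregate staged rows into a curated table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS totals AS "
        "SELECT source, SUM(amount) AS total FROM staging GROUP BY source"
    )

def consume(conn):
    # Consumption stage: downstream readers query the curated table via SQL.
    return dict(conn.execute("SELECT source, total FROM totals ORDER BY source"))

conn = sqlite3.connect(":memory:")
ingest(conn, [("web", 10.0), ("web", 5.0), ("mobile", 2.5)])
transform(conn)
result = consume(conn)
print(result)  # {'mobile': 2.5, 'web': 15.0}
```

The three stages are kept as separate functions so each can be swapped for its real counterpart (a Kafka consumer, a Spark job, a warehouse query) without touching the others.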
Data Scientist: dive into cutting-edge projects, analyze Big Data, and craft actionable insights. Design, develop, and maintain projects using Big Data technologies. Utilize machine learning and data mining in technical and client interactions. Technologies: Big Data technologies (e.g., Hadoop, Spark), machine learning.
Data Engineers are responsible for building and maintaining Big Data pipelines using data platforms. Skills: Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash. Salary: market related.
The individual must be comfortable working with Big Data: cleaning, analysing, interpreting, and reporting back. Consulting services are provided to customers in the form of Big Data Analytics and Supply Chain Opportunity Assessments.
Experience in Project Management
Good knowledge of big data and how to use it effectively to streamline
Good programming skills, specifically for big data and automation of reports
Great at working