leverage your skills in building and maintaining Big Data Pipelines using advanced cloud platforms. We PySpark - Boto3 - ETL - Docker - Linux / Unix - Big Data - PowerShell / Bash - Cloud Data Hub (CDH) - Engineers are responsible for building and maintaining Big Data Pipelines using Data Platforms. They are custodians
Net Developer to join their team. Become part of a big software development consulting house and get yourself
growth.
Oracle/PostgreSQL - PySpark - Boto3 - ETL - Docker - Linux / Unix - Big Data - PowerShell / Bash - Glue - CloudWatch - SNS - Athena
equivalent; 5 years of experience in the DevOps space; Big Data platforms (vanilla Hadoop, Cloudera, or Hortonworks)
Microservices. Solid understanding of GoldenGate plug-ins (Big Data / Kafka), GG Directors, and Cloud Console. Cloud
Skills: General AWS experience in the data science / big data context; Docker and Kubernetes experience
management, development, and capacity building are also a big part of this role, plus problem-solving and decision-making
you aspire to dare for better? And to “shake up big brands that do good?” One of the ways that we make