leverage your skills in building and maintaining Big Data pipelines using advanced cloud platforms. Key technologies: PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, Cloud Data Hub (CDH). Engineers are responsible for building and maintaining Big Data pipelines on these data platforms. They are custodians
.NET Developer to join their team. Become part of a large software development consulting house and get yourself
challenging days of the IT industry; today we enjoy a large community of trusted customers, employees, and partners
growth.
Excel and MS Access proficiency. Ability to work with big data. Minimum 5 years' experience in an annuity /
Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, Glue, CloudWatch, SNS, Athena
equivalent. 5 years of experience in the DevOps space. Big Data platforms (vanilla Hadoop, Cloudera, or Hortonworks)
Microservices. Solid understanding of GoldenGate plug-ins (Big Data / Kafka), GG Directors, and Cloud Console. Cloud