Leverage your skills in building and maintaining Big Data pipelines using advanced cloud platforms. Data Engineers are responsible for building and maintaining Big Data pipelines using GROUP Data Platforms. They are custodians of the data on these platforms.

REQUIRED TECHNICAL SKILLS
- Oracle/PostgreSQL
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- GROUP Cloud Data Hub (CDH)

ADVANTAGEOUS TECHNICAL SKILLS
- Glue
- CloudWatch
- SNS
- Athena
Skills: general AWS experience in the data science / big data context; Docker and Kubernetes experience.
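The posting repeatedly names ETL as a core skill. As a minimal sketch of that pattern, the snippet below shows an extract-transform-load pipeline using only the Python standard library; the field names (`name`, `amount`) and the in-memory CSV source/sink are hypothetical stand-ins for the real data sources a role like this would use (e.g. PySpark/Boto3 against cloud storage).

```python
# Minimal ETL sketch (stdlib only). Field names and data are illustrative.
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    """Transform: normalise names, coerce amounts, drop malformed rows."""
    out = []
    for r in records:
        name = r["name"].strip().title()
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # drop rows with a non-numeric amount
        out.append({"name": name, "amount": amount})
    return out

def load(records: list[dict]) -> str:
    """Load: serialise records back to CSV (stand-in for a real sink)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "amount"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

raw = "name,amount\n alice ,10.5\nbob,oops\nCAROL,3\n"
result = load(transform(extract(raw)))
```

In a production pipeline each stage would typically read from and write to external systems (S3 via Boto3, a Spark DataFrame, a warehouse table), but the three-stage structure stays the same.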
Communication skills across all organisation levels; methodology skills in large organisations/initiatives; IT infrastructure operations.
Able to handle multiple tasks at the same time without losing sight of the big picture. Problem-solving skills: develops and establishes solutions.
Delivers a stable and secure solution. Experience working in large company environments with more than 100 000 Windows