- ETL
- Docker
- Linux / Unix
- Big Data
- PowerShell / Bash
- Enterprise Collaboration
The Hub is responsible for building and maintaining Big Data pipelines using data platforms, acting as custodian of the data.
Requirements: proficiency in SQL, Python, and big data tools; strong problem-solving skills and a passion for data engineering.
- Oracle/PostgreSQL
- PySpark
- Boto3
- Glue
- CloudWatch
- SNS
- Athena
- S3
Paracon Delivery: wrangling (big) data from multiple sources into a reliable asset aimed at evolution.
Cloud infrastructure. Active working experience in Big Data is ideal, as is experience in market research and marketing.
Experience with a cloud platform (AWS, Azure, or Google Cloud) will be a big advantage. A data analysis background is beneficial. Key Responsibilities: