Data-driven programming languages such as Python, and big data pipelines such as ETL and SQL. Strong working software development skills, with 3 – 5 years' experience building big data pipelines. Expertise in Data Intelligence and
Work with Terraform, Python, SQL, and more. Dive into Big Data Pipelines, ETL, and cloud platforms. Data Wizards wanted.
– 8 years' knowledge of analysis and take-on of big data, including importing and exporting of data
Data-driven programming languages such as Python, and big data pipelines such as ETL and SQL. Strong working knowledge, with at least 3 years' experience building big data pipelines (ETL, SQL, etc.). ADVANTAGEOUS TECHNICAL
Will be responsible for building and maintaining Big Data Pipelines using Data Platforms, and must be proficient in Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, and PowerShell/Bash. Must have 3 – 5 years' working experience.
**What You'll Do:**
- Build and maintain Big Data Pipelines like a boss
- Work your magic with ETL
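The ETL work described above can be sketched in miniature (a rough illustration only, using Python's standard library; the table and field names are hypothetical, and SQLite stands in for the Oracle/PostgreSQL warehouses named in the role):

```python
import sqlite3

# Extract: hypothetical raw records. In practice these would come from
# Oracle/PostgreSQL, S3 via Boto3, or flat files.
raw_rows = [
    {"id": 1, "amount": "100.50", "currency": "zar"},
    {"id": 2, "amount": "20.00", "currency": "USD"},
]

# Transform: normalise types and casing before loading.
clean_rows = [
    (r["id"], float(r["amount"]), r["currency"].upper()) for r in raw_rows
]

# Load: write into a SQL table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE transactions (id INTEGER, amount REAL, currency TEXT)"
)
conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)", clean_rows)

# Downstream consumers query the loaded table.
total = conn.execute("SELECT SUM(amount) FROM transactions").fetchone()[0]
print(total)  # 120.5
```

A production pipeline would do the same three steps at scale, typically with PySpark jobs reading from and writing to managed data platforms rather than an in-memory database.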
Data Engineers are responsible for building and maintaining Big Data Pipelines using GROUP Data Platforms. Tech stack: Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash. Experience in working with
Degree, plus 5 – 8 years' progressive experience within a big data environment (Banking, Finance, Mobile Network)