Strong working knowledge of software languages such as Python and of big data pipelines (ETL, SQL, etc.); 3 years' experience building big data pipelines (ETL, SQL, etc.).
Skills / Technology:
· Terraform; Python 3.x; PySpark
· Boto3; ETL
· Docker; Linux / Unix; Big Data; PowerShell
Advantageous Skills Requirements (understanding of the following, in order of importance; a short usage sketch follows the list):
· Trino Distributed SQL queries.
· Glue (ETL Scripting)
· CloudWatch; SNS
· Athena; S3; Kinesis Streams
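In practice these AWS services are usually reached through boto3. As a purely illustrative sketch (none of these resource names come from the advert; the bucket, database, table and topic ARN are placeholders), the snippet below runs an Athena query and publishes the outcome to an SNS topic:

    # Illustrative only: run an Athena query with boto3 and notify an SNS topic when it finishes.
    # All resource names below are placeholders, not real systems.
    import time
    import boto3

    athena = boto3.client("athena")
    sns = boto3.client("sns")

    RESULTS_BUCKET = "s3://example-athena-results/"                   # placeholder output location
    TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:etl-alerts"       # placeholder topic

    def run_query(sql: str, database: str = "analytics") -> str:
        """Start an Athena query, wait for it to finish, and return the execution id."""
        execution = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": database},
            ResultConfiguration={"OutputLocation": RESULTS_BUCKET},
        )
        query_id = execution["QueryExecutionId"]
        while True:
            status = athena.get_query_execution(QueryExecutionId=query_id)
            state = status["QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                break
            time.sleep(2)
        # Publish the outcome so downstream consumers or alerting can react.
        sns.publish(TopicArn=TOPIC_ARN, Subject="Athena query finished",
                    Message=f"{query_id}: {state}")
        return query_id

    if __name__ == "__main__":
        run_query("SELECT event_date, COUNT(*) AS events FROM web_events GROUP BY event_date")

Glue ETL scripting follows the same pattern, with PySpark doing the transformation work inside the job.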
Responsible for driving, designing, and building scalable ETL systems for a big data warehouse, implementing a robust methodology.
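As a rough sketch of what such an ETL job might look like (the connection details, table names and S3 path below are invented placeholders, not the employer's systems), a PySpark batch load could read from PostgreSQL over JDBC, clean the data, and write partitioned Parquet to the warehouse location:

    # Illustrative only: one shape a scalable PySpark ETL job for a warehouse load could take.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

    # Extract: pull the source table from PostgreSQL over JDBC
    # (requires the Postgres JDBC driver on the Spark classpath).
    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:postgresql://source-db:5432/sales")   # placeholder host/database
        .option("dbtable", "public.orders")
        .option("user", "etl_user")
        .option("password", "***")
        .load()
    )

    # Transform: type the timestamp, derive a partition column, drop obvious bad rows.
    cleaned = (
        orders
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .withColumn("order_date", F.to_date("order_ts"))
        .filter(F.col("order_total") >= 0)
    )

    # Load: write partitioned Parquet to the warehouse/lake location.
    cleaned.write.mode("overwrite").partitionBy("order_date").parquet("s3a://example-dwh/orders/")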
Python 3.x; SQL (Oracle/PostgreSQL); PySpark; Boto3; ETL; Docker; Linux / Unix; Big Data; PowerShell / Bash; Cloud
Terraform, Python 3.x, SQL (Oracle/PostgreSQL), PySpark, Boto3, ETL, Docker, Linux/Unix, and other relevant Big Data technologies.
•Design of the Extract, Transform and Load (ETL) solution to get data into a data warehouse or data mart.
•Integrate complex data sources through both real-time and ETL-based data extraction mechanisms (see the sketch below).
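For the real-time side, one minimal option is polling a Kinesis stream with boto3 (Kinesis Streams appears among the advantageous skills). The stream name below is a placeholder, and a production consumer would more likely use the Kinesis Client Library or enhanced fan-out rather than this loop:

    # Hedged sketch of the "real time" extraction side: tailing a Kinesis stream with boto3.
    import json
    import time
    import boto3

    kinesis = boto3.client("kinesis")
    STREAM = "example-clickstream"   # placeholder stream name

    shard_id = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"][0]["ShardId"]
    iterator = kinesis.get_shard_iterator(
        StreamName=STREAM, ShardId=shard_id, ShardIteratorType="LATEST"
    )["ShardIterator"]

    while True:
        batch = kinesis.get_records(ShardIterator=iterator, Limit=100)
        for record in batch["Records"]:
            event = json.loads(record["Data"])   # assumes each record carries a JSON payload
            print(event)                         # stand-in for the real transform/load step
        iterator = batch["NextShardIterator"]
        time.sleep(1)                            # stay under the per-shard read limits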
Skills / Technology: Terraform; Python 3.x; PySpark; Boto3; ETL; Docker; Linux / Unix; Big Data; PowerShell / Bash. Basic understanding of the following advantageous skills (in order of importance): Trino Distributed SQL queries; Glue (ETL Scripting); CloudWatch; SNS; Athena; S3; Kinesis Streams.
Perform data extraction, transformation, and loading (ETL) processes. Create, test, and deploy Power BI scripts.
Operating System: Windows, Linux and Unix
BI/DWH/ETL Tools: Qlik, AWS QuickSight
DBMS / RDBMS: Postgres