investigating possible improvements, and building new ETLs. Scope: daily call management to address any issues in the environment and models (SLA management). Building new ETLs to support the integration of additional data sources
and implement alerts on data. Develop reports and ETL packages in line with Business Intelligence best practices and data warehousing. Will be responsible for developing ETL packages and data warehousing, as well as developing
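A data alert of the kind described above can be sketched in Python; the metric name (`amount`) and the thresholds are illustrative assumptions, not values from the posting:

```python
def check_alerts(rows, max_null_ratio=0.1, expected_min_rows=100):
    """Return a list of alert messages for a batch of loaded rows.

    Thresholds and the 'amount' field are illustrative assumptions.
    """
    alerts = []
    if len(rows) < expected_min_rows:
        alerts.append(f"row count {len(rows)} below expected minimum {expected_min_rows}")
    nulls = sum(1 for r in rows if r.get("amount") is None)
    if rows and nulls / len(rows) > max_null_ratio:
        alerts.append(f"null ratio {nulls / len(rows):.0%} exceeds {max_null_ratio:.0%}")
    return alerts
```

In practice the returned messages would be pushed to a notification channel (e.g. email or a messaging topic) rather than returned to the caller.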
with a minimum of 5 years' relevant experience in ETL, SQL, Data Cleansing
Responsibilities
Perform data extraction, transformation, and loading (ETL) operations. Analyze complex data sets to identify trends. Experience in a similar BI Analyst role. Strong knowledge of ETL data flow processes. Demonstrable experience in identifying
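The extract, transform, and load steps named above can be sketched with the standard library alone; the table and column names (`sales`, `customer`, `amount`) are hypothetical:

```python
import csv
import io
import sqlite3

def etl(csv_text, conn):
    """Minimal ETL sketch: extract rows from CSV text, transform (clean and
    type-cast), and load into a SQLite table. Schema names are illustrative."""
    # Extract: parse the raw CSV into dictionaries
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    # Transform: strip whitespace, cast amount to float, drop incomplete rows
    cleaned = [
        (r["customer"].strip(), float(r["amount"]))
        for r in rows
        if r.get("customer") and r.get("amount")
    ]
    # Load: write the cleaned rows into the target table
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    return len(cleaned)
```

A production pipeline would add logging, batching, and error handling around each of the three steps, but the shape is the same.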
Requirements: · Terraform; Python 3.x; PySpark · Boto3; ETL · Docker; Linux/Unix; Big Data; PowerShell/Bash (in order of importance): · Trino distributed SQL queries · Glue (ETL scripting) · CloudWatch; SNS · Athena; S3 · Kinesis
languages such as Python and Big Data pipelines such as ETL, SQL. Strong working knowledge of software development. 3 years' experience building big data pipelines (ETL, SQL, etc.) ADVANTAGEOUS TECHNICAL SKILLS: Understanding
mapping, field mapping and value mapping for the ETL process. Ability to manage, facilitate, and drive
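Field mapping and value mapping for an ETL process can be sketched with plain dictionaries; the source/target field names and the country-code table here are hypothetical:

```python
# Field mapping: source column name -> target column name (hypothetical names)
FIELD_MAP = {"cust_nm": "customer_name", "ctry_cd": "country"}

# Value mapping: normalise coded source values (hypothetical code table)
VALUE_MAP = {"country": {"ZA": "South Africa", "UK": "United Kingdom"}}

def map_record(source_row):
    """Apply field and value mappings to one source record."""
    target = {}
    for src_field, tgt_field in FIELD_MAP.items():
        value = source_row.get(src_field)
        # Translate coded values where a value map exists for the target field;
        # unknown codes pass through unchanged for later review
        target[tgt_field] = VALUE_MAP.get(tgt_field, {}).get(value, value)
    return target
```

In real pipelines these mappings usually live in configuration or a mapping table rather than in code, so analysts can maintain them without a deployment.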
Responsibilities: Data profiling
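Data profiling, as listed in these responsibilities, typically means computing per-column statistics such as null counts and distinct counts. A minimal stdlib sketch (the statistics chosen are illustrative):

```python
from collections import Counter

def profile_column(values):
    """Profile one column: row count, null count, distinct count, top value."""
    # Treat both None and empty string as missing
    non_null = [v for v in values if v is not None and v != ""]
    counts = Counter(non_null)
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(counts),
        "top_value": counts.most_common(1)[0][0] if counts else None,
    }
```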
from data sets and identifying trends and patterns; ETL Framework – This includes an understanding of data
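Identifying trends in a data set can be as simple as fitting a least-squares slope to a numeric series; this is one minimal sketch of the idea, not the method any particular posting requires:

```python
def trend_slope(values):
    """Least-squares slope of a numeric series indexed 0..n-1.

    Positive result suggests an upward trend, negative a downward one.
    Requires at least two points (variance of x must be non-zero).
    """
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var
```

Real trend detection would account for seasonality and noise (e.g. via moving averages or statistical tests), but a slope check is a common first pass.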