The Integration Engineer will be responsible for developing, testing, and implementing system integrations that connect cloud platforms such as Google Cloud. Requirements include knowledge of data integration and ETL tools and familiarity with Agile and DevOps methodologies.
Developer with expertise in C# and experience in data ETL/ELT, who will play a crucial part in designing and implementing data pipelines. Requirements: proven experience designing and implementing data ETL/ELT pipelines using tools such as Azure Data Factory; familiarity with version control systems (e.g., Git), unit testing, and CI/CD pipelines; excellent problem-solving skills.
Candidates should possess strong analytical skills, experience with ETL processes, and a background in data modeling. Essential skills: SQL (Oracle/PostgreSQL), PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell; the ability to work with large and complex data sets; thorough testing and data validation; strong written and verbal communication.
Experience with technologies such as Kafka, AWS Cloud technologies, and ETL tools. KPAs: design and develop data warehousing and data lake solutions; ensure data accuracy; implement ETL processes using tools such as AWS Glue and Azure Data Factory.
Experience with data engineering tools such as ETL tools, data pipeline tools, and data warehousing tools.
Develop and maintain data mappings, transformations, and ETL processes to ensure data consistency and accuracy. Testing and deployment: develop and execute test plans and test cases to validate MDM solutions. Experience with data integration technologies (e.g., ETL tools, APIs, web services) and relational databases.
Experience with enterprise BI reporting and Extract, Transform, Load (ETL) processes and environments. Required qualifications: data warehouse/data mart design and configuration; use of ETL tools to load data stores, data warehouses, and data marts.
Surface relevant insights and aid decision-making. Develop, test, and deploy Power BI scripts, create data visualizations for better data understanding, and improve ETL procedures to support new systems.
- SQL (Oracle/PostgreSQL)
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- Thorough testing and data validation to ensure data accuracy
- Strong written and verbal communication skills
- Write specifications for programs to be designed, coded, tested, and debugged
- Strong organizational skills
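The requirements above repeatedly pair ETL work with thorough testing and data validation. A minimal Python sketch of that pattern is shown below; every function and field name is invented for illustration and does not come from any specific posting (a real pipeline would extract from Oracle/PostgreSQL or S3 via Boto3 and load into a warehouse):

```python
# Hypothetical extract -> transform -> validate -> load pipeline.

def extract():
    # Stand-in for a database or S3 read; returns raw string records.
    return [{"id": "1", "amount": "10.5"}, {"id": "2", "amount": "3.0"}]

def transform(rows):
    # Cast string fields to typed values.
    return [{"id": int(r["id"]), "amount": float(r["amount"])} for r in rows]

def validate(rows):
    # Data validation step: reject missing ids and negative amounts.
    for r in rows:
        if r.get("id") is None or r["amount"] < 0:
            raise ValueError(f"invalid row: {r}")
    return rows

def load(rows, target):
    # Stand-in for a warehouse write; returns the number of rows loaded.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(validate(transform(extract())), warehouse)
print(loaded)  # 2
```

Keeping validation as its own stage, rather than folding checks into the transform, makes the pipeline easier to unit test — a practice the postings above call out under CI/CD and test-plan development.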