Integration Engineer / ETL Developer

Responsibilities:
- Develop, test, and implement system integrations, including integrations with cloud platforms such as Google Cloud.
- Design and implement data ETL/ELT pipelines using tools such as Azure Data Factory.
- Develop and maintain data mappings, transformations, and ETL processes to ensure data consistency and accuracy.
- Develop and execute test plans and test cases to validate MDM functionality, user experience, and usability (testing and deployment).
- Use ETL tools to load data stores, data warehouses, and data marts; support data warehouse/data mart design and configuration.
- Support enterprise BI reporting and Extract, Transform, Load (ETL) processes and environments.

Required qualifications:
- Expertise in C# and experience with data ETL/ELT; the role plays a crucial part in designing and implementing data pipelines.
- Proven experience designing and implementing data ETL/ELT pipelines using tools such as Azure Data Factory.
- Experience with version control systems (e.g., Git), unit testing, and CI/CD pipelines.
- Excellent problem-solving and strong analytical skills, experience with ETL processes, and a background in data modeling.
- Analytical skills for large and complex data sets.
- Experience with data integration technologies (e.g., ETL tools, APIs, web services) and relational databases.
- Knowledge of data integration and ETL tools.
- Familiarity with Agile and DevOps methodologies.
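The qualifications above mention unit testing and CI/CD for ETL/ELT pipelines. As a minimal, illustrative sketch of what a unit-tested transform could look like (the `clean_record` function and its field names are hypothetical, not taken from any of the postings):

```python
import unittest

def clean_record(record):
    """Hypothetical ETL transform: trim whitespace, normalize the email
    to lower case, and coerce the amount field to a float."""
    return {
        "name": record["name"].strip(),
        "email": record["email"].strip().lower(),
        "amount": float(record["amount"]),
    }

class CleanRecordTest(unittest.TestCase):
    """Unit tests a CI/CD pipeline could run on every commit."""

    def test_normalizes_fields(self):
        raw = {"name": "  Ada ", "email": " ADA@Example.COM ", "amount": "12.50"}
        self.assertEqual(
            clean_record(raw),
            {"name": "Ada", "email": "ada@example.com", "amount": 12.5},
        )

    def test_rejects_non_numeric_amount(self):
        # Bad source data should fail loudly rather than load silently.
        with self.assertRaises(ValueError):
            clean_record({"name": "x", "email": "x@y.z", "amount": "n/a"})
```

Running `python -m unittest` as a CI step (e.g., in a pipeline triggered on each push) would execute these tests automatically.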
Essential skills:
- SQL (Oracle/PostgreSQL)
- PySpark
- Boto3
- ETL
- Docker
- Linux / Unix
- Big Data
- Thorough testing and data validation to ensure the accuracy of data
- Strong written and verbal communication skills
- Strong organizational skills
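The skills above stress thorough testing and data validation to ensure the accuracy of data. A minimal sketch of an ETL step with built-in validation, using only the standard library's `sqlite3` (the table and column names are hypothetical):

```python
import sqlite3

def run_etl(conn):
    """Extract rows from a raw table, transform and validate them,
    and load the clean rows into a target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS clean_orders (id INTEGER, amount REAL)")
    rows = conn.execute("SELECT id, amount FROM raw_orders").fetchall()

    loaded, rejected = 0, 0
    for oid, amount in rows:
        # Transform: coerce the amount to a float.
        try:
            value = float(amount)
        except (TypeError, ValueError):
            rejected += 1
            continue
        # Validate: reject negative amounts.
        if value < 0:
            rejected += 1
            continue
        conn.execute("INSERT INTO clean_orders VALUES (?, ?)", (oid, value))
        loaded += 1

    # Data validation: row counts must reconcile between source and target.
    assert loaded + rejected == len(rows)
    conn.commit()
    return loaded, rejected

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [(1, "10.5"), (2, "-3"), (3, "oops"), (4, "7")])
print(run_etl(conn))  # prints (2, 2): two rows loaded, two rejected
```

The same extract/transform/validate/load shape carries over to PySpark or Azure Data Factory jobs; the reconciliation count at the end is a common sanity check that no rows were silently dropped.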