The Pipeline Engineer works as part of the SKA-Mid Computing and Software Telescope Operations team to operate and maintain the Mid telescope astronomy pipelines and other data analysis products, such as the QA matrix. Working with Senior Pipeline Developers and Platform Developers, they identify, research, and improve these products.
* Documentation and testing of the pipeline and data processing software
* Support deployment of, and provide ongoing support for, the SKA-Mid telescope data processing pipelines and software
* Participate in SKA-Mid Computing