various job boards and social media channels. Capture profiles into the Applicant Tracking System and
• Ability to develop in data-driven programming languages such as Python and big data pipelines such as ETL (software development)
ESSENTIAL SKILLS:
• Expertise in Data Intelligence and Business Intelligence (knowledge/experience)
• At least 3 years' experience building big data pipelines (ETL, SQL, etc.)
ADVANTAGEOUS TECHNICAL
• Ability to develop in data-driven programming languages such as Python and big data pipelines such as ETL
• Agile Working Model (AWM) Charter
ADVANTAGEOUS SKILLS:
• Data and API mining
• Knowledge of security best practices; setting up alerting pipelines
• Comfortable with data structures and algorithms
• Understanding of integration
• Docker container creation and usage
• Familiarity with data streaming services such as Apache Kafka
• Coordination
• Knowledge of Jira, Confluence and Agile methodologies
• Data analysis
• ITSM knowledge
• User support ticket management
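The "building big data pipelines (ETL)" skill asked for above can be illustrated with a minimal extract-transform-load sketch in Python; the file layout and the `user_id`/`amount` field names are illustrative assumptions, not part of the advertised role.

```python
import csv
import json

def extract(path):
    """Extract: read raw rows from a CSV file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: normalise fields and drop incomplete records."""
    out = []
    for row in rows:
        if not row.get("user_id"):
            continue  # skip rows without a key field
        out.append({
            "user_id": row["user_id"].strip(),
            "amount": float(row.get("amount") or 0),
        })
    return out

def load(rows, path):
    """Load: write cleaned records as JSON lines."""
    with open(path, "w") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")
```

In a production pipeline each stage would typically be a separate, scheduled, monitored step; the three-function split here only shows the shape of the work.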
artifacts and activities. Maintenance of customer base data. Engage with stakeholders of TLM Crisis Management.
• Practical experience of IT infrastructure, i.e. data centres, networks, servers, storage, platform, middleware
• IT process governance
• Data analysis – the ability to analyse and visualise data sets (Excel, Power BI
environments to build a database access monitoring solution for PostgreSQL databases. In-depth knowledge and presentation skills. Knowledge of data modelling and data visualisation tools. Cloud Experience:
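One building block of a PostgreSQL access monitoring solution like the one described above is polling the `pg_stat_activity` system view and flagging suspicious sessions. A sketch, assuming a 60-second long-running threshold (the threshold and the flagging rule are illustrative; executing the SQL would require a driver such as psycopg2, which is not shown here):

```python
from datetime import timedelta

# SQL to list active sessions from PostgreSQL's pg_stat_activity view.
MONITOR_QUERY = """
SELECT pid, usename, client_addr, state,
       now() - query_start AS running_for
FROM pg_stat_activity
WHERE state <> 'idle';
"""

def flag_long_running(rows, threshold_seconds=60):
    """Given (pid, user, addr, state, running_for) tuples as returned by
    MONITOR_QUERY, return (pid, user) pairs past the threshold."""
    flagged = []
    for pid, user, addr, state, running_for in rows:
        if running_for is not None and running_for.total_seconds() > threshold_seconds:
            flagged.append((pid, user))
    return flagged
```

The flagged pairs would then feed an alerting pipeline (e.g. a ticket or a chat notification) rather than being printed.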
• Ansible
• Experience with Business Intelligence and data visualisation tools (e.g. AWS QuickSight)
• Experience with workflows for database systems
• Experience with common data formats, e.g. YAML, JSON
• Experience in managing the integration of database systems, including data flow management
• Understanding of various database systems
• Development, preferably in Python (e.g. to transform and share data between databases)
• Experience with RESTful APIs
• External interface partners: plan the integration and data flows between multiple CMDBs and network management
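Planning data flows between multiple CMDBs, as mentioned above, usually comes down to reconciling asset records from sources of differing authority. A small Python sketch of one such rule (the `asset_id` key, the field names, and the "primary source wins" precedence are assumptions for illustration):

```python
def merge_cmdb_records(primary, secondary, key="asset_id"):
    """Merge two lists of CMDB asset records (dicts).
    Records from `primary` win on conflicting fields; `secondary`
    fills in assets the primary source does not know about."""
    merged = {rec[key]: dict(rec) for rec in secondary}
    for rec in primary:
        merged.setdefault(rec[key], {}).update(rec)
    return list(merged.values())
```

In practice the precedence rule would be agreed per attribute with the external interface partners, not applied record-wide as here.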
reporting, and communication • To ensure data is updated and that data integrity is maintained on all internal
providing input on benefits and risks. Prepare test data and support testing activities, including unit testing, with process owners. Prepare cut-over strategy, including data migration planning. Support pre- and post-Go-Live
process partners and other departments, maintaining data consistency across departments and process partners.
• Azure, AWS or SAP BTP knowledge
• PostgreSQL
• Python data analysis
• Power Apps or other low-code tools
• SAP
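The "Python data analysis" skill listed above can be as simple as grouping and summarising records pulled from a database, the kind of summary otherwise built in Excel or Power BI. A stdlib-only sketch (the department/value column shape is an illustrative assumption):

```python
from collections import defaultdict
from statistics import mean

def summarise_by_department(rows):
    """Group (department, value) rows and report count and mean per group."""
    groups = defaultdict(list)
    for dept, value in rows:
        groups[dept].append(value)
    return {dept: {"count": len(vals), "mean": mean(vals)}
            for dept, vals in groups.items()}
```

For larger data sets the same grouping would normally be pushed into SQL (`GROUP BY`) or done with pandas rather than in plain Python.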