Key Performance Areas:
Data and Flow Management:
Monitor and manage system backups, Evolution data synchronisation, and constant data flow (regional and local) with minimal system down days. Maintain clean data backup procedures for data integrity and conduct successful recoveries. Respond to data loss incidents. A database administration qualification is essential (e.g., SQL Server), along with data analysis experience and proven experience in an IT administration role.
Design and maintain integrations that connect various software applications and data sources. This role involves collaborating with cross-functional teams, documenting processes, and maintaining compliance with data security regulations. Requirements:
- Knowledge of APIs, web services (SOAP, REST), and data formats (XML/JSON).
- Experience with programming platforms (AWS, Azure, Google Cloud).
- Knowledge of data integration and ETL tools.
- Familiarity with Agile methodologies.
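The XML/JSON data-format requirement above can be illustrated with a minimal Python sketch that converts a small XML document into JSON using only the standard library. The `order`/`item` element names are invented for the example, not taken from any actual system:

```python
import json
import xml.etree.ElementTree as ET

# A tiny XML document standing in for data arriving from an upstream system.
XML = "<order id='42'><item sku='A1' qty='2'/><item sku='B7' qty='1'/></order>"

def order_to_json(xml_text: str) -> str:
    """Parse the XML order and re-emit it as a JSON string."""
    root = ET.fromstring(xml_text)
    return json.dumps({
        "id": root.get("id"),
        "items": [{"sku": i.get("sku"), "qty": int(i.get("qty"))}
                  for i in root.findall("item")],
    })

doc = order_to_json(XML)
```

A real integration would add schema validation and error handling around the parse step; the sketch shows only the format conversion itself.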
Ensure the functionality of DAM (Database Activity Monitoring) components:
- Ensure that the generated PostgreSQL data is delivered to Splunk on a regularly defined basis.
- Ensure that the defined PostgreSQL databases are monitored.
- Maintain tight communication and cooperation with relevant interface partners.
- Assist the database operations and application teams with the DAM tool.
- Ensure that DAM does not interfere with database functionality.
- Engage with software providers at least on a quarterly basis.
- Build a cloud solution for database activity monitoring.
- Get to know different
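The Splunk delivery duty above could, for example, involve wrapping database-activity rows in the Splunk HTTP Event Collector (HEC) envelope before sending. This is a minimal sketch, not the team's actual tooling: the `db_activity` index, the `pgsql:dam` sourcetype, and the row fields are assumed placeholder names.

```python
import json
import time

def to_hec_event(row: dict, index: str = "db_activity",
                 sourcetype: str = "pgsql:dam") -> str:
    """Wrap one database-activity row in the Splunk HEC event envelope.

    The index and sourcetype defaults are placeholders; a real deployment
    would use whatever names the Splunk team has provisioned.
    """
    return json.dumps({
        "time": row.get("event_time", time.time()),
        "index": index,
        "sourcetype": sourcetype,
        "event": row,
    })

# Example: one row as it might come from a DAM query against PostgreSQL.
sample = {"event_time": 1700000000, "db": "orders",
          "usename": "app_user", "query": "SELECT 1"}
payload = to_hec_event(sample)
```

The resulting string would be POSTed to the collector endpoint; the HTTP call is omitted so the payload shape stands alone.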
Ability to develop in data-driven programming languages such as Python and to build big data pipelines such as ETL software.
ESSENTIAL SKILLS:
- Expertise in Data Intelligence and Business Intelligence.
- At least 3 years' experience building big data pipelines (ETL, SQL, etc.).
- Agile Working Model (AWM) Charter.
ADVANTAGEOUS SKILLS:
- Data and API mining.
- Knowledge of security best practices and setting up alerting pipelines.
- Comfort with data structures and algorithms.
- Understanding of integration; Docker container creation and usage.
- Familiarity with data streaming services such as Apache Kafka.
- Coordination; knowledge of Jira, Confluence, and Agile methodologies.
- Data analysis; ITSM knowledge; user support and ticket management.
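As an illustration of the ETL pipeline skill named above, here is a minimal Python sketch: extract rows from CSV text, transform them, and load them into SQLite. The table and column names are invented for the example; a production pipeline would target a real warehouse and handle errors and incremental loads.

```python
import csv
import io
import sqlite3

# Raw input standing in for a file landed by an upstream system.
RAW = """region,sales
north,100
south,250
north,50
"""

def extract(text: str) -> list[dict]:
    """Read CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple[str, int]]:
    """Cast types and normalise the region name."""
    return [(r["region"].upper(), int(r["sales"])) for r in rows]

def load(rows: list[tuple[str, int]]) -> sqlite3.Connection:
    """Load the transformed rows into an in-memory SQLite table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    return conn

conn = load(transform(extract(RAW)))
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))
```

Chaining the three stages as plain functions keeps each step independently testable, which is the property most ETL frameworks formalise.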
• Maintain assessment policies. • Operate and maintain the health of the network security system architecture.
- Create RESTful APIs.
- Create data clients to consume RESTful APIs.
- Integrate data storage solutions (knowledge of Spring Boot).
- Ability to script and manipulate data in JavaScript/TypeScript (knowledge of server-side APIs).
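A data client that consumes a RESTful API, as required above, might look like the following Python sketch. `ProductClient` and the product-list payload are hypothetical names invented for the example, and the HTTP call itself is omitted so the parsing logic stands alone:

```python
import json
from dataclasses import dataclass

@dataclass
class Product:
    sku: str
    price: float

class ProductClient:
    """Client for a hypothetical product-listing endpoint.

    A real client would fetch the body with an HTTP GET (e.g. via
    urllib.request); here the response body is passed in directly.
    """
    def parse(self, body: str) -> list[Product]:
        # Map each JSON object onto a typed record, casting price to float.
        return [Product(sku=p["sku"], price=float(p["price"]))
                for p in json.loads(body)]

# Canned response standing in for the API's JSON body.
body = '[{"sku": "A1", "price": "9.99"}, {"sku": "B7", "price": "4.50"}]'
products = ProductClient().parse(body)
```

Separating transport from parsing this way lets the parsing be unit-tested without a running server.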