literate, able to work full-time or part-time shifts/weekends, and be immediately available. Minimum Requirements:
As an Oracle Cloud Data Engineer, you're stepping into a pivotal role where you'll be instrumental in harnessing Oracle Cloud technology to deliver data-driven solutions. Your expertise will be crucial in designing, implementing, and optimizing data pipelines, ensuring seamless data integration and accessibility for transformative data initiatives. Together, we'll unlock the full potential of our data ecosystem and drive progress across the
sector. Get ready to dive into the dynamic world of data-driven solutions, where every insight you uncover makes a difference.
Functional Specifications for them. Preparing test data for testing of CRs (Change Requests). Testing CRs. Willingness and ability to work on weekends and public holidays on implementation and operations
tasks. a) Account Management · Achieve timeous PO capturing on Sage so that AP can accrue costs incurred
SKILLS: SAP BW / SAP SAC. Data modelling and data engineering skills. SAP BW 7.5 data modelling and BEx skills. SAP BW/4HANA data modelling skills. SAP BW/4HANA query modelling skills. Any additional responsibilities as required. SAP BW-IP knowledge is beneficial. SAP Data Intelligence skills are beneficial. Modules - SAP
An AWS Data Engineer with -12 years' hands-on data engineering experience is required to join a team. Skills: Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, Glue, CloudWatch, SNS, Athena, S3. Business Intelligence (BI) experience. Technical data modelling and schema design ("not drag and drop").
Ability to develop in data-driven programming languages such as Python and Big Data pipelines such as ETL. Agile Working Model (AWM) Charter. ADVANTAGEOUS SKILLS: Data and API mining. Knowledge of security best practices. Setting up alerting pipelines. Comfortable with data structures and algorithms. Understanding of integration. Docker container creation and usage. Familiarity with data streaming services such as Apache Kafka. Coordination. Knowledge of Jira, Confluence and Agile methodologies. Data analysis. ITSM knowledge. User support ticket management.
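To illustrate the kind of ETL pipeline development these roles call for, here is a minimal sketch in pure standard-library Python. The record fields and the cleaning rule are invented for illustration; in practice the extract and load steps would target services like S3, Glue, or Athena rather than in-memory data.

```python
import csv
import io

def extract(csv_text):
    """Extract: parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalise types and drop malformed rows."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return cleaned

def load(rows, sink):
    """Load: write cleaned rows to a destination (a list stands in for a warehouse)."""
    sink.extend(rows)
    return len(rows)

raw = "id,amount\n1,10.5\n2,oops\n3,7.0\n"
sink = []
loaded = load(transform(extract(raw)), sink)  # row 2 is dropped as malformed
```

The three-stage split keeps each step independently testable, which is the usual motivation for structuring pipelines this way regardless of the underlying platform.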