department to help develop the strategy for long-term Big Data platform architecture
High Performance Computing, Data Warehousing, Big Data Processing
Strong experience working with Kubernetes, Hadoop, Kafka, NiFi, or Spark, or with cloud-based big data processing environments such as Amazon Redshift, BigQuery, and Azure Synapse Analytics
Experience with Big Data technologies such as Hadoop, Spark, and Hive
Julia, T-SQL, PowerShell
Experience working with cloud-based Big Data technologies (AWS, Azure, etc.) is advantageous
such as Python, R, SQL, etc. Big Data Technologies: knowledge of big data technologies such as Hadoop
Intelligence Integration experience. Experience working with Big Data and Data Lake architecture
(Excel, Word, PowerPoint), etc. Ability to work with big data sets. Educational requirements: minimum B.Com
and data modelling techniques. Experience with big data technologies. Proficient in SQL and data querying
also maintaining a 'big picture' perspective for translating financial management data into actionable insights