Designing and building scalable ETL systems for a big data warehouse; robust and trustworthy implementation and operations; expert knowledge of emerging trends across Data/Analytics (Big Data, Machine Learning, Deep Learning, AI)
Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, Glue, CloudWatch, SNS, Athena
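The toolchain above centres on Python-based ETL. A minimal sketch of the extract/transform/load pattern follows; all function names and records are hypothetical, and in a real pipeline the extract and load steps would read from and write to S3 or the warehouse via Boto3/Glue rather than in-memory lists:

```python
# Minimal ETL sketch (hypothetical names, in-memory stand-ins).
# In production, extract() would read raw objects from S3 via boto3,
# and load() would write to a warehouse table (e.g., via Glue or Athena).

def extract():
    # Stand-in for reading raw records from a source system.
    return [{"id": "1", "amount": "10.5"}, {"id": "2", "amount": "7.25"},
            {"id": "x", "amount": "bad"}]

def transform(rows):
    # Cast types and drop malformed records.
    out = []
    for row in rows:
        try:
            out.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail type conversion
    return out

def load(rows, sink):
    # Stand-in for writing cleaned rows to the warehouse.
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

The same three-stage shape carries over to PySpark, where `transform` becomes DataFrame operations and the load step writes partitioned Parquet.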
Microservices; solid understanding of Oracle GoldenGate plug-ins (Big Data / Kafka), GoldenGate Director, and the Cloud Console
5 years of equivalent experience in the DevOps space; Big Data platforms (vanilla Hadoop, Cloudera, or Hortonworks)
trust-building, honest conversations, dreaming big, teamwork, ownership, continuous learning, performance
Financial systems or systems-auditing experience from a Big 4 firm preferable; experience in using visualisation tools
Skills: general AWS experience in the data science / big data context; Docker and Kubernetes experience
Platforms (e.g., AWS, Azure, Google Cloud); knowledge of big data technologies (e.g., Hadoop, Spark); experience