Designing and building scalable ETL systems for a big data warehouse, with robust and trustworthy implementation and operations. Expert awareness of trends across Data/Analytics (Big Data, Machine Learning, Deep Learning, AI)
Microservices. Solid understanding of GoldenGate plug-ins (Big Data / Kafka), GoldenGate Director, and the Cloud Console. Cloud
equivalent of 5 years of experience in the DevOps space. Big Data platforms (vanilla Hadoop, Cloudera, or Hortonworks)
Software Engineer to join the team. This is your big chance to immerse yourself in best practices among
trust-building, honest conversations, dreaming big, teamwork, ownership, continuous learning, performance
Skills: General AWS experience in the data science / big data context. Docker and Kubernetes experience
a friend who is a technology specialist? We pay BIG CASH to you if we place a friend you refer to us
platforms (e.g., AWS, Azure, Google Cloud). Knowledge of big data technologies (e.g., Hadoop, Spark). Experience