Ab Initio, Hadoop, ETL, Hive, SQL, Oracle, ExpressIT, and Unix
· Extensive hands-on experience with the Hadoop framework
environments including Unix, Docker, Kubernetes, Hadoop, Kafka, NiFi, Spark, or cloud-based big data processing
Analytics experience with Big Data technologies such as Hadoop, Spark, and Hive. At least 5 years' advanced experience
systems using distributed frameworks like Apache Hadoop. Collaborate with data analysts to understand data
PySpark, Scala
Big Data Processing Frameworks: Apache Hadoop, Spark, Flink
Distributed Storage: HDFS, Cloud
Integration experience: Elasticsearch, Kafka (Hadoop), SOAP and REST, etc. Please note this is a 12-Month
Platform experience like Elasticsearch, Kafka (Hadoop). Messaging protocols and API technologies experience
infrastructure, and big data technologies such as Hadoop, Spark, or Kafka. Knowledge of data governance
Technology Skills: SQL, Oracle DB, MySQL, SQL Server, Excel, Hadoop, Azure, AWS. The Reference Number for this position
in the DevOps space. Big Data platforms (vanilla Hadoop, Cloudera, or Hortonworks). Strong SQL query experience
languages (SQL) and hybrid systems/platforms, SAS and Hadoop. Knowledge of statistical models. Ability to map/document