PostgreSQL)
- Proficiency with big data tools (e.g., Hadoop, Spark, Kafka)
- Familiarity with cloud services
Azure, GCP).
- Knowledge of big data technologies (Hadoop, Spark).
Working Conditions:
- Onsite: This role
Understanding of several Big Data technologies, such as Hadoop, MapReduce, and Spark, as well as event processing systems. An understanding of, and hands-on experience with, Hadoop/Spark-based distributed storage and computing frameworks
interactions
Technologies:
- Big Data technologies (e.g., Hadoop, Spark)
- Machine learning frameworks (e.g., TensorFlow
stakeholders