Examveda

Which distributed computing framework is commonly used for batch processing of large datasets and is often associated with Hadoop?

A. Apache Kafka

B. Apache HBase

C. Apache Spark

D. Apache Hive

Answer: Option C

Apache Spark is the distributed computing framework commonly used for batch processing of large datasets, and it is closely associated with the Hadoop ecosystem (it can run on YARN and read from HDFS). The other options serve different roles: Kafka is a distributed messaging/streaming platform, HBase is a NoSQL column-family store, and Hive is a SQL-on-Hadoop data warehouse layer rather than a general-purpose computing framework.
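For context, the batch pattern that Hadoop MapReduce popularized and Spark generalizes (map, shuffle/group by key, reduce) can be sketched in plain Python; this is an illustrative stand-in, not actual Spark or Hadoop API code:

```python
from collections import defaultdict

def map_phase(lines):
    # Map step: emit (word, 1) pairs, as a word-count mapper would
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle + reduce step: group pairs by key and sum the counts
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data batch processing", "big data frameworks"]
word_counts = reduce_phase(map_phase(lines))
print(word_counts["big"])  # "big" appears twice across the two lines
```

In a real Spark job the same logic is expressed with distributed operations (e.g. `flatMap` followed by `reduceByKey`), which the engine partitions across a cluster instead of running on one machine.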


Category: Data Science » Big Data and Distributed Computing
