In distributed computing, what is the term for a group of computers connected over a network that work together to solve a problem or perform a task?

What is the primary advantage of using distributed computing frameworks like Hadoop and Spark for big data processing?
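The question above concerns the map/shuffle/reduce pattern that frameworks like Hadoop parallelize across many machines. As a minimal single-machine sketch of that pattern (all function names here are illustrative, not framework APIs), a word count looks like:

```python
from collections import defaultdict

# Toy word count in the map/shuffle/reduce style. In a real framework,
# map_phase runs in parallel on many nodes, the shuffle moves pairs with
# the same key to the same node, and reduce_phase aggregates per key.
def map_phase(line):
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big compute", "big data"]
pairs = [p for line in lines for p in map_phase(line)]
counts = reduce_phase(shuffle(pairs))
```

The advantage the question points at is that each phase is embarrassingly parallel over its inputs, so the same logic scales from one machine to thousands.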

Which distributed computing framework is known for its in-memory processing capabilities and is often used for iterative machine learning algorithms?
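To illustrate the idea behind this question — why in-memory caching matters for iterative algorithms (the concept behind Spark's `rdd.cache()`) — here is a plain-Python sketch. The dataset, loader, and gradient step are all hypothetical stand-ins, not Spark API calls:

```python
# Without caching, every iteration of an iterative algorithm would
# re-read its input from (slow) distributed storage. Caching loads
# the data into memory once; later iterations reuse the in-memory copy.
load_count = 0

def load_dataset():
    # Stands in for an expensive read from a distributed filesystem.
    global load_count
    load_count += 1
    return [1.0, 2.0, 3.0, 4.0]

data = load_dataset()  # load once; "cached" in memory from here on

theta = 0.0
for _ in range(10):  # e.g. ten gradient-descent steps over the same data
    grad = sum(x - theta for x in data) / len(data)
    theta += 0.1 * grad
```

With ten iterations the data is still loaded exactly once; a disk-based pipeline that re-materializes intermediate results each pass (the Hadoop MapReduce model) pays the I/O cost every iteration instead.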

What is the main goal of data partitioning in distributed computing?
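As a concrete sketch of the concept behind this last question, here is minimal hash partitioning in plain Python (the partition count and record layout are illustrative assumptions): records sharing a key are routed to the same partition, so each node can process its partition independently and in parallel.

```python
from collections import defaultdict

# Hash partitioning: key -> partition index. All records with the same
# key land on the same partition, enabling independent parallel work
# (and avoiding cross-node shuffles for per-key aggregations).
NUM_PARTITIONS = 4

def partition_for(key):
    return hash(key) % NUM_PARTITIONS

records = [("user1", 10), ("user2", 5), ("user1", 7), ("user3", 2)]
partitions = defaultdict(list)
for key, value in records:
    partitions[partition_for(key)].append((key, value))
```

Note that both "user1" records end up in the same partition, which is what lets a per-user aggregation run without any further data movement.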