Examveda

What is the robots.txt file used for?

A. Blocking all web scraping activities

B. Allowing all web scraping activities

C. Providing guidelines for web scraping

D. Controlling browser rendering

Answer: Option C

Solution (By Examveda Team)

The robots.txt file, placed at the root of a website (e.g., https://example.com/robots.txt), provides guidelines for web crawlers, specifying which paths they may or may not access. It is advisory rather than technically enforced, so a well-behaved scraper should check it before crawling a site.
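As a minimal sketch, Python's standard-library urllib.robotparser can fetch a site's robots.txt and check whether a given URL may be crawled. The site URL and user-agent string below are placeholders, not values from the question:

    from urllib.robotparser import RobotFileParser

    # Point the parser at the site's robots.txt (example.com is a placeholder).
    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # Ask whether a given user agent may fetch a specific URL.
    # "MyScraperBot" is a hypothetical user-agent string.
    if rp.can_fetch("MyScraperBot", "https://example.com/some/page"):
        print("Allowed to scrape this page")
    else:
        print("Disallowed by robots.txt")

Because robots.txt is only a convention, can_fetch() reports what the site requests; honoring that answer is up to the scraper.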
