1. robots.txt is used in the context of _____________.
   a. Hiding some pages from the search engine crawler
   b. Making the crawler look at these files easily
   c. Making the website sitemap
   d. Enabling easier access to content
Answer» a. Hiding some pages from the search engine crawler

The robots.txt file is primarily used to specify which parts of your website should (or should not) be crawled by spiders or web crawlers. It can specify different rules for different crawlers.
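For illustration, a minimal robots.txt sketch showing per-crawler rules; the paths, bot name, and domain below are hypothetical examples, not from the question:

```
# Rules for all crawlers
User-agent: *
# Hide this hypothetical section from crawlers
Disallow: /private/

# Stricter rule for one specific (made-up) spider
User-agent: ExampleBot
Disallow: /

# Optionally point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the site root (e.g. `https://www.example.com/robots.txt`); each `User-agent` group applies its `Disallow` rules only to the named crawler, which is how different rules can be set for different spiders.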