1. robots.txt is used in the context of _____________.
   a. Hiding some pages from the search engine crawler
   b. Making the crawler look at these files easily
   c. Making the website sitemap
   d. Enabling easier access to content
Answer: (a) Hiding some pages from the search engine crawler. The robots.txt file is primarily used to tell spiders or web crawlers which parts of a website they may or may not crawl, and it can specify different rules for different crawlers.
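As an illustration, a minimal robots.txt sketch using the standard Robots Exclusion Protocol directives (`User-agent`, `Disallow`); the `/private/` path is a hypothetical example, while `Googlebot` is Google's real crawler token:

```
# Hypothetical example: keep all crawlers out of the /private/ directory
User-agent: *
Disallow: /private/

# Give Google's crawler access to everything (empty Disallow = allow all)
User-agent: Googlebot
Disallow:
```

The file must be served at the site root (e.g. `https://example.com/robots.txt`); compliant crawlers fetch it before crawling and apply the most specific `User-agent` group that matches them.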