Answer:
A robots.txt file, placed at the root of a web application, tells search engine crawlers which parts of the site they should not crawl, giving the site owner control over which content gets discovered and indexed. It follows the Robots Exclusion Protocol and is advisory only: compliant crawlers honor it, but it does not actually block access to the listed paths.
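For example, a minimal robots.txt (the paths below are purely illustrative) might look like this:

    User-agent: *
    Disallow: /admin/
    Disallow: /private/
    Sitemap: https://www.example.com/sitemap.xml

Here, "User-agent: *" applies the rules to all crawlers, the Disallow lines ask them to skip the /admin/ and /private/ sections, and the Sitemap line points them to a list of pages the site does want crawled.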