Robots.txt
Summary
The 'robots.txt' file is a standard used by websites to communicate with web crawlers and other web robots, instructing them on which parts of the site should not be processed or scanned. This protocol helps manage server load, protect sensitive information, and control the indexing of site content by search engines.
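As a concrete illustration, here is a minimal sketch that defines a small robots.txt file and checks it with Python's standard-library urllib.robotparser. The specific rules, paths, and the "SomeBot" user-agent name are invented for this example; on a real site the file is served from /robots.txt at the root of the domain.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; the groups, paths, and
# user-agent names here are made up for the example.
SAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Crawl-delay: 10

User-agent: Googlebot
Disallow: /private/
"""

# Parse the rules with Python's standard-library robots.txt parser.
parser = RobotFileParser()
parser.parse(SAMPLE_ROBOTS_TXT.splitlines())

# A generic crawler matches the wildcard (*) group, so it is barred
# from both /admin/ and /private/ but may fetch everything else.
print(parser.can_fetch("SomeBot", "/admin/page.html"))    # False
print(parser.can_fetch("SomeBot", "/index.html"))         # True

# Googlebot matches its own, more specific group, which overrides the
# wildcard group entirely: /admin/ is allowed, /private/ is not.
print(parser.can_fetch("Googlebot", "/admin/page.html"))  # True
print(parser.can_fetch("Googlebot", "/private/data"))     # False

# Crawl-delay is the requested pause between requests, in seconds.
print(parser.crawl_delay("SomeBot"))                      # 10
```

Note the group-selection rule the example relies on: a crawler obeys only the most specific User-agent group that matches it, which is why Googlebot above ignores the wildcard rules rather than combining them with its own.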