If you’re using AccessAlly’s protected content, you might do a search in Google or another search engine and find some of your files are indexed and showing up in the results.
The good news is that no one can access these files unless they’re logged in and have the correct tags.
But it might not be the best user experience to have people click through to a protected file, so you might want to remove them from the search engines entirely.
Here’s how to do that!
Blocking Files In Robots.txt
The best way to block search engines from indexing and linking to files you’d rather keep private is to edit your site’s “robots.txt” file.
This file sits in your site’s root directory (e.g. yoursite.com/robots.txt), and it tells search engine crawlers which parts of your site are fair game and which are off limits.
This is the text you’ll want to add to this file to stop the search engines from indexing your protected content files:
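As a rough sketch, a Disallow rule of this shape blocks all crawlers from an entire folder. The folder path below is hypothetical; replace it with the directory where your protected files actually live:

```
# Hypothetical example – substitute the real folder holding your protected files
User-agent: *
Disallow: /wp-content/uploads/protected-files/
```

Note that robots.txt only discourages crawling; it doesn’t password-protect anything, so your tag-based access control remains the actual security layer.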
Two Main Ways of Editing Robots.txt
If you’ve never edited your robots.txt file before, there are a few different ways to do it. Here are the main options, along with resources that walk you through the steps: