Setting up a robots.txt file

Do you know how to set up a robots.txt file? Heck, do you even know what one is? A robots.txt file is a neat little text file that sits on your website and tells spiders and web crawlers (those are the same thing, by the way) which files they're allowed to look at and which they're not. Well-behaved crawlers check it before fetching anything, but keep in mind it's a convention, not an enforcement mechanism.
So here's basically what you do: create a text file called robots.txt and put it in the root directory of your website. It should contain something like this:

User-agent: *
Disallow: /tmp
Disallow: /logs

This tells every robot (that's what the * means) not to look in the /tmp folder or the /logs folder, where "/" is the root of the website. Note that Disallow works by prefix match, so everything under those folders is covered too.
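If you want to sanity-check your rules before going live, you can feed them to Python's standard-library robots.txt parser. This is just a quick sketch using the sample rules above; the "MyCrawler" user-agent name is made up for illustration.

```python
from urllib.robotparser import RobotFileParser

# The sample robots.txt rules from above. parse() accepts the
# file's lines directly, so no network access is needed.
rules = [
    "User-agent: *",
    "Disallow: /tmp",
    "Disallow: /logs",
]

parser = RobotFileParser()
parser.parse(rules)

# Disallow matches by prefix, so anything under /tmp or /logs is blocked.
print(parser.can_fetch("MyCrawler", "/tmp/cache.dat"))    # False
print(parser.can_fetch("MyCrawler", "/logs/access.log"))  # False
print(parser.can_fetch("MyCrawler", "/index.html"))       # True
```

Handy for catching a typo in a Disallow line before a real crawler trips over it.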
If you want to learn more, the Robots Exclusion Protocol is documented in full at robotstxt.org.