TechTV had an article on this, but they're going through some major changes so I hesitate to post a link; it may soon be invalid.
You have to put a robots.txt file in the root directory of each website you want to cover. The one for my webpage looks something like this:
---Start of file---
# robots.txt for http://www.mywebpage.com/
# Disallow: /folder_name/
User-agent: *
Disallow: /
---End of file---
This tells every search engine that honors robots.txt to stay out of all of my folders. The first two lines (the ones starting with #) are comments; crawlers ignore them, and they're just notes for whoever maintains the file.
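If you only want to keep crawlers out of certain directories rather than the whole site, you can uncomment and adapt that Disallow line instead of blocking everything. Here's a rough sketch; the folder names are just placeholders, so swap in your own:

---Start of file---
# robots.txt for http://www.mywebpage.com/
User-agent: *
Disallow: /private/
Disallow: /images/
---End of file---

Each Disallow line blocks one path prefix, and anything not listed stays crawlable. Keep in mind robots.txt is only a request; well-behaved crawlers follow it, but it won't actually hide anything from a bot that chooses to ignore it.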