The Google webmaster help center has been updated recently. Two new sections that may prove particularly useful are:
Using a robots.txt file
New topics have been added to the How Google crawls my site section. These topics include information on:
How to create a robots.txt file (see the example after this list)
Descriptions of each user-agent that Google uses
How to use pattern matching
How often we recrawl your robots.txt file (around once a day)
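As a quick sketch (a hypothetical file, not one taken from the help center itself), a robots.txt file sits at the root of your site and lists one or more user-agents along with the paths they should not crawl; Googlebot also understands simple pattern matching with * and $:

    # Keep all crawlers out of the /private/ directory
    User-agent: *
    Disallow: /private/

    # Keep only Googlebot-Image out of /photos/
    User-agent: Googlebot-Image
    Disallow: /photos/

    # Pattern matching: keep Googlebot away from any URL ending in .pdf
    User-agent: Googlebot
    Disallow: /*.pdf$

The directory and file names above are illustrative only; the help center topics listed above cover the full syntax and each of Google's user-agents.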
Understanding HTTP status codes
This section explains the HTTP status codes that your server might return when we request a page of your site. We display HTTP status codes in several places in Google Sitemaps (such as on the robots.txt analysis page and on the crawl errors page), and some site owners have asked us to provide more information about what these codes mean.
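For reference, the status code is the three-digit number at the start of your server's response. A few common examples (illustrative responses, not output from any particular site):

    HTTP/1.1 200 OK                    (the page was returned successfully)
    HTTP/1.1 301 Moved Permanently     (the page has permanently moved to a new URL)
    HTTP/1.1 404 Not Found             (the requested page does not exist on the server)
    HTTP/1.1 503 Service Unavailable   (the server is temporarily unable to respond)

The new help center section goes into more detail about what each code means for how we crawl your site.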