In this thread, he recommends keeping your robots.txt file uncomplicated when you create it. John said:
When possible, I'd really recommend keeping the robots.txt file as simple as possible, so that you don't have trouble with maintenance and that it's really only disallowing resources that are problematic when crawled (or when its content is indexed).
Surprisingly, John even suggested removing the robots.txt file entirely if it isn't needed. If you do keep one, remember to keep the file under 500KB so it doesn't become too complex.
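To illustrate the kind of simplicity John is describing, here is a minimal robots.txt sketch that only disallows resources that could be problematic when crawled. The directory paths shown are hypothetical examples, not recommendations for any specific site:

```
# Minimal robots.txt: apply the rules to all crawlers
User-agent: *

# Block only paths that cause problems when crawled,
# e.g. internal search results or temporary files (example paths)
Disallow: /search/
Disallow: /tmp/

# Everything else remains crawlable by default
```

A file like this is easy to maintain and stays far below the 500KB limit; anything not explicitly disallowed is allowed by default, so there is no need to list pages you want crawled.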
If you want more help with this, check out the Google Webmaster Help thread.