robots.txt file and subdomains

Hi all, another n00b question please,

I'm puzzled about how robots.txt works for subdomains. If I want to disallow robots from sub_domain.my_site.com, can I do:

1).

User-agent: *
Disallow: /sub_domain/

Am I doing this right? And what about sub_sub_domain.sub_domain.my_site.com?

2).

User-agent: *
Disallow: /sub_domain/sub_sub_domain/

Or, if I disallow spidering "sub_domain", does that disallow spidering "sub_sub_domain" too? In other words, do I need only the No. 1 rules, or both No. 1 and No. 2 in the robots.txt file?
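
In case it helps, here's how I've been trying to check my own reading of the rules, using urllib.robotparser from Python's standard library. The domain names are just placeholders, and I don't know whether this parser behaves exactly like real crawlers do, so treat it as a sketch:

# Sketch: checking my reading of the rules with Python's stdlib parser.
# Placeholder domains throughout; real crawlers may behave differently.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Pretend this is the robots.txt served from the main site's root.
rp.parse([
    "User-agent: *",
    "Disallow: /sub_domain/",
])

# A path under /sub_domain/ on the main host is blocked (prints False).
print(rp.can_fetch("*", "http://my_site.com/sub_domain/page.html"))

# A deeper path is covered by the same prefix rule (prints False).
print(rp.can_fetch("*", "http://my_site.com/sub_domain/sub_sub_domain/page.html"))

# A page on the actual subdomain host (prints True here -- the path rule
# doesn't seem to touch it, which is exactly what confuses me).
print(rp.can_fetch("*", "http://sub_domain.my_site.com/page.html"))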

Thank you,
