Robots.txt File to Disallow the Whole Website


Do you want to block your whole website from all bots and crawlers? If so, copy the relevant snippet below into your robots.txt file. These rules tell bots and crawlers not to access your website.

To block the whole website

User-agent: *
Disallow: /
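A quick way to sanity-check these rules locally is Python's standard-library robots.txt parser, `urllib.robotparser` (the `example.com` URLs here are placeholders):

```python
from urllib.robotparser import RobotFileParser

# The blanket-block rules from above.
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# Every user agent matches "*", and "Disallow: /" covers every path,
# so no compliant crawler may fetch anything.
print(rp.can_fetch("Googlebot", "https://example.com/"))        # False
print(rp.can_fetch("AnyOtherBot", "https://example.com/page"))  # False
```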

To block the whole site from Google's crawler (note that Google's standard crawler token is Googlebot, not Google)

User-agent: Googlebot
Disallow: /

To block the whole website from a specific bad bot (replace BadBot with that bot's actual user-agent token; keep in mind that robots.txt is advisory, and truly malicious bots often ignore it)

User-agent: BadBot
Disallow: /

To allow a single bot and block the rest (the empty Disallow line means nothing is disallowed for that bot)

User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
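You can verify the allow-one-block-the-rest behaviour the same way with `urllib.robotparser`: the more specific user-agent group wins for the named bot, and everyone else falls through to the `*` group (again, `example.com` is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# The allow-one, block-the-rest rules from above.
rules = [
    "User-agent: Googlebot",
    "Disallow:",           # empty Disallow = nothing is off-limits
    "",
    "User-agent: *",
    "Disallow: /",         # everyone else is blocked entirely
]

rp = RobotFileParser()
rp.parse(rules)

# Googlebot matches its own group and is allowed everywhere;
# any other bot falls through to the "*" group and is blocked.
print(rp.can_fetch("Googlebot", "https://example.com/page"))  # True
print(rp.can_fetch("OtherBot", "https://example.com/page"))   # False
```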

Also read this: Robots.txt to Disallow Subdomains