In WebSite Auditor, you can create a robots.txt file for your site from scratch, or fetch the existing one to revise or change any of its restrictions. The tool also lets you test all the rules to see exactly how any page or resource will be treated by any bot.
Go to the Pages module under Site Structure and click the WebSite Tools > Robots.txt button in the main toolbar to begin. In the wizard that pops up, you can add a new rule by selecting the instruction, the bot it will apply to, and the directory or page it will cover.
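For reference, each rule in the resulting file is simply a user agent line paired with a directive and a path; Allow is a widely supported extension to the original Disallow-only syntax. The directories below are placeholders used purely for illustration:

    User-agent: *            # the bot the rules apply to (* means every crawler)
    Disallow: /admin/        # block this directory for that bot
    Allow: /blog/            # explicitly permit this directory

    User-agent: Googlebot    # a rule group targeting one specific crawler
    Disallow: /search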
If you already have a robots.txt file, click Fetch From Server, and in a few seconds you’ll see all the rules in the file along with the robots and pages they affect. These rules can easily be removed, edited, or rearranged.
In the bottom tab, you can review the full contents of the file as it currently stands, or test how the rules apply to any resource on your site.
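To show what such a test amounts to, here is a minimal sketch using Python’s standard urllib.robotparser module; the rules, bot name, and URLs are placeholder assumptions, and this only approximates the check rather than reproducing WebSite Auditor’s own code:

    from urllib.robotparser import RobotFileParser

    # Placeholder rules; in practice you would test against your generated robots.txt.
    rules = """
    User-agent: *
    Disallow: /admin/
    Disallow: /search
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    # See how a given bot would treat specific resources (placeholder URLs).
    for url in ("https://example.com/admin/settings.html",
                "https://example.com/blog/post.html"):
        verdict = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
        print(url, "->", verdict)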
Once you’re done with the rules, click Next. In Step 2, select a publishing option: save the robots.txt file to your hard drive or upload it to your website via FTP.
Depending on the option you selected, in Step 3 you’ll need to either specify the destination folder and file naming settings or configure the FTP connection details.
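If you pick the FTP option and want a sense of what the upload itself involves, it comes down to a single file transfer to your site’s root. Below is a minimal sketch using Python’s ftplib; the host, credentials, and directory are placeholders, and WebSite Auditor performs this step for you based on the FTP settings you enter:

    from ftplib import FTP

    # Placeholder connection details; substitute your own host and credentials.
    with FTP("ftp.example.com") as ftp:
        ftp.login("username", "password")
        ftp.cwd("/public_html")  # site root, where crawlers look for robots.txt
        with open("robots.txt", "rb") as f:
            ftp.storbinary("STOR robots.txt", f)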
Then click Next, and the robots.txt file will be generated. In the last step, you can check the log or open the containing folder to see the file.