I would recommend making a robots.txt along the lines of:
User-agent: *
Disallow: /basket.php
Disallow: /user_login.php
Disallow: /admin
At least they won't spider the basket and login pages or your admin folder with this (assuming they follow the rules). Put the robots.txt in your web root. You could also use a .htaccess file to enforce the same blocking (see the sketch below).
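If you do want to go the .htaccess route on an Apache server, a rough sketch would be something like the lines below (this assumes mod_rewrite is enabled; the user-agent names are only examples, adjust the list to whichever bots you want to keep out):

# Return 403 Forbidden to the listed crawlers for these paths
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|msnbot|slurp) [NC]
RewriteRule ^(basket\.php|user_login\.php|admin) - [F,L]

Unlike robots.txt, this is enforced by the server, so even bots that ignore robots.txt are blocked. Just be careful not to shut search engines out of pages you actually want indexed.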
Can ViArt products be spidered by web bots?
How come it doesn't create a default robots.txt for you? It looks to me like it leaves everything open to spidering by default, which is slightly worrying.
Am I right?