The user agent can be spoofed, so your administrator must verify that it is indeed Yahoo causing the problem. Also, robots sometimes go haywire, and this may be an isolated incident. If the abuse continues, then action will be necessary. MSN hammered one of my sites once and drove the bandwidth up in a very short time; however, it never happened again, so I didn't do anything about it.
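If you want to check for yourself, a common verification is a reverse DNS lookup on the offending IP from your access log. Here's a minimal sketch in Python; the IP address is a made-up example, and the crawl.yahoo.net hostname suffix is an assumption you should confirm against Yahoo's own documentation before acting on it.

import socket

# hypothetical IP pulled from your access log
ip = "74.6.8.138"

try:
    # spoofed "Slurp" user agents won't reverse-resolve to a yahoo hostname
    host, _, _ = socket.gethostbyaddr(ip)
except socket.herror:
    host = ""

# genuine yahoo crawlers have reverse-resolved to crawl.yahoo.net
# (an assumption - confirm the current hostnames with yahoo)
if host.endswith(".crawl.yahoo.net"):
    print("looks like a genuine yahoo spider:", host)
else:
    print("no yahoo hostname - the user agent was probably spoofed")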
If you have the same data for all products in static pages (HTML files outside of the store) as you do in the store, then there is no reason for the spiders to index the store. Further, this can be considered duplicate content, which could very well be why Google has grayed out the ranking meter for your store pages. Again, there is nothing to gain by allowing a spider to index your store if all of your product information can be indexed outside of it.
You're better off with the static pages (SEO-wise) than having all of your product information hidden from many spiders inside dynamic pages. So keep what you have; just disallow spiders from indexing your store.
One important thing here: I am assuming that duplicate content is the issue with Google and possibly other search engines. Exclude the spiders from indexing your store with robots.txt. If you find that your placement, traffic, sales, or anything else takes a negative hit after adding the robots.txt file, then either delete the robots.txt or change it to allow everything.
robots.txt file - how to
Create a new plain text file with Notepad and name it robots.txt.
To allow all robots to index everything, add the following to the robots.txt file...
User-agent: *
Disallow:
To block all robots from indexing everything (not advisable!)...
# warning this is not a good thing to do!
User-agent: *
Disallow: /
To exclude the store folder from all robots...
User-agent: *
Disallow: /store_folder_name/
The beginning "/" is the public root, so the full path to the store folder is necessary. If your store is in the cgi-bin, then you would do this...
User-agent: *
Disallow: /cgi-bin/store_folder_name/
If the store is in a subfolder of a folder in the public root (public_html, www, or whatever), then do the following...
User-agent: *
Disallow: /members/store_folder_name/
This would look something like the following in a URL...
http://www.domain.com/members/store_folder_name/agora.cgi
Another thing to remember: by virtue of the trailing "/", any subfolders under the folder you're excluding will not be indexed or accessed either. So in your case, the following will prevent spiders from accessing any files and/or folders in your store but allow indexing everywhere else...
User-agent: *
Disallow: /shopping/
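To illustrate what that trailing "/" buys you, with the rule above a well-behaved spider would skip the whole store tree (the file and folder names here are just examples)...

http://www.domain.com/shopping/agora.cgi (blocked)
http://www.domain.com/shopping/images/logo.gif (blocked - subfolder)
http://www.domain.com/products.html (still indexed)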
Whenever dealing with SEO fixes or changes, you must monitor the results over time to make sure you're not doing anything harmful; a simple mistake can have negative results. The fact that your store has been grayed out doesn't necessarily mean something really nasty is going on, just that something isn't right. You may be slightly penalized in placement, or you may suffer severe placement problems by not fixing it. It's hard to tell without finding out exactly why Google has a problem with your store. However, the fact that your TLD and static pages have ranking tells me that your site hasn't been banned.
Add the last example to your robots.txt file, then upload it in ASCII mode to your public root folder.
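Once it's uploaded, you can sanity-check the rules before waiting on the spiders. Here's a minimal sketch using Python's standard urllib.robotparser module; the domain and paths are just placeholders for your own.

import urllib.robotparser

# placeholder domain - substitute your own
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.domain.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

# the store should now be off limits to any user agent...
print(rp.can_fetch("*", "http://www.domain.com/shopping/agora.cgi"))  # expect False

# ...while static pages stay crawlable
print(rp.can_fetch("*", "http://www.domain.com/products.html"))  # expect True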
The robots.txt file will not in itself be harmful if the problem is indeed duplicate content. But then again, this is Google, and they don't bother telling you the why of anything, only that in their infinite wisdom they decided to mess with your head. It would be no problem for them to offer a site-check form that tells you what the major problems are, much like the W3C does, but they don't; they just do whatever they want and explain nothing to anyone. So keep an eye on changes in placement, traffic, page ranking, etc. If there's any negative effect, remove the robots.txt ASAP.
d