I've been doing some domain name research recently and wanted to know whether a specific domain extension was ranking more effectively than others in a particular regional Google index.
You can do this in SB, but I did it using Ninja Suite and Google Scraper.
First, I built a list of the high-value keywords for the niche from Google AdWords.
Then I queried the regional Google index with each of those keywords, returning just the top 10 results and capturing all the URLs.
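If you wanted to script this step yourself rather than use a scraper tool, the query construction looks something like the sketch below. This is only an illustration of building regional search URLs with the stdlib; the parameter names mirror Google's public search URL format, actually fetching SERPs at scale is against Google's terms and normally needs an official API or a scraping tool like the ones above.

```python
from urllib.parse import urlencode

def build_query_url(keyword, tld="fr", num=10):
    """Build a regional Google search URL for one keyword.

    Hypothetical helper: queries the regional index (google.fr here)
    and asks for the top `num` results, matching the capture step above.
    """
    params = {"q": keyword, "num": num}
    return f"https://www.google.{tld}/search?" + urlencode(params)

# One URL per keyword in the researched list:
urls = [build_query_url(kw) for kw in ["achat maison", "credit immobilier"]]
```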
I then ran the Goal Keeper process (Ninja Suite) on the URL list, tagging just the domain extension I was interested in (in this case .fr).
(In SB you could use "Remove URLs Containing" a couple of times to get the same output.)
The result: one list of URLs on that specific extension, and one list with everything else.
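The tagging step above is just a partition of the captured URL list by host suffix. A minimal sketch, assuming the URLs were captured as plain strings (the function name and .fr default are mine, not from any of the tools mentioned):

```python
from urllib.parse import urlparse

def split_by_tld(urls, tld=".fr"):
    """Partition URLs into (target-extension list, everything-else list)."""
    matched, rest = [], []
    for url in urls:
        host = urlparse(url).netloc.lower()
        # Suffix check on the hostname, so paths containing ".fr" don't match.
        (matched if host.endswith(tld) else rest).append(url)
    return matched, rest
```

Checking the hostname suffix rather than substring-matching the whole URL is what makes this equivalent to tagging by extension instead of by text.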
What I found was that for this niche, 82% of the results came from that single domain extension; the remaining 18% accounted for all the other extensions combined.
So the domain extension I thought I should choose was backed by an 82% prevalence rate in the regional Google index.
So, just for fun, and seeing as I already had the data:
Filtering both lists for unique domains, each list contained only 150 domains.
In other words, for a niche that is worth millions, Google only showed results from 150 domains on its first page.
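Both headline numbers, the 82% prevalence and the unique-domain count, fall out of the same deduplication pass. A sketch under the same assumptions as above (URL strings in, hostnames as the notion of "domain"; in practice you might also want to strip "www." or reduce hosts to registrable domains, which this deliberately skips):

```python
from urllib.parse import urlparse

def domain_stats(urls, tld=".fr"):
    """Return (unique target-TLD domains, unique other domains, target share %)."""
    domains = {urlparse(u).netloc.lower() for u in urls}  # dedupe by hostname
    matched = {d for d in domains if d.endswith(tld)}
    share = 100 * len(matched) / len(domains) if domains else 0.0
    return len(matched), len(domains) - len(matched), share
```

Note that share here is computed over unique domains; computing it over raw result rows instead (as the 82% figure in the post may have been) just means skipping the dedupe set.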
Research Domain names the black hat way