Q&A - How do I find out which pages are sending traffic to my competitors? - Is there a way to know how much traffic is being sent to a competitor's domain by specific referring pages? I'm looking for a technique or tool that might help.
Answer: If the competitor is large enough, you might be able to get data from Hitwise or comScore. However, they generally don't have detailed data on smaller companies, and accessing their data can be quite pricey. Alexa will give you a rough idea of the level of traffic, but won't tell you where that traffic comes from.

Another option is to study their inbound links using Yahoo's Site Explorer. This won't tell you exactly which pages are sending a specific amount of traffic, but it will give you an idea of where a site's non-search-engine traffic is coming from.

Spyfu.com is another very useful site for gathering intelligence on your competition. Spyfu operates its own search spider, which crawls search results to see who is ranking for which queries and which terms they're bidding on in Google AdWords. Simply enter a domain name into their search interface and Spyfu will return a range of useful data about that site.

By using Spyfu to see what a site is ranking for and which PPC keywords it's bidding on, and Yahoo Site Explorer to see who is linking to it, you can build a reasonably accurate picture of where its traffic is coming from. Cross-correlate that with its Alexa traffic stats and you should get a good general idea of the site's traffic profile.

However, if you want exact numbers, you can occasionally find sites that have left their server logs publicly available. Sometimes this is intentional, such as when they use a free stat-counter program that publishes the data. Other times the data is made public accidentally, when a site forgets to password-protect the directory that hosts its server log files. It's easy to scan a site using Google to see whether Google has indexed any directories that are likely to contain stats files. For example, many sites store their stats in a folder like:

http://www.domain.com/admin/logs/referrers

Here we use domain.com as an example site, but you can see a real-world example here:

http://opendocument.xml.org/admin/logs/referrers

In all likelihood this site intentionally left its stats page public, though other sites may do so unintentionally. Some common searches that will uncover similar directories include the following:

site:http://www.domain.com/ inurl:stats
site:http://www.domain.com/ inurl:statistics
site:http://www.domain.com/ inurl:sitestats
site:http://www.domain.com/ inurl:log
site:http://www.domain.com/ inurl:logs
site:http://www.domain.com/ inurl:referrers
site:http://www.domain.com/ inurl:referrer
site:http://www.domain.com/ inurl:referrals
site:http://www.domain.com/ inurl:refer
site:http://www.domain.com/ inurl:admin
site:http://www.domain.com/ "site meter"
site:http://www.domain.com/ inurl:awstats.pl
site:http://www.domain.com/ intitle:"statistics of"

Obviously, replace domain.com with the domain name of the site you're trying to gather competitive intelligence on.

While the above searches will often turn up log files for a given site, the more important lesson is that competitors may be using these techniques against you if you haven't properly password-protected the directories on your site that contain sensitive data. Remember that anything you upload to your site will likely be indexed by Google if you don't restrict Googlebot's access. Try running some of the above searches on your own site to see if Google has turned up anything you don't want your competition to see.
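If you run these checks regularly, retyping each operator by hand gets tedious. Here is a minimal Python sketch that builds a Google search URL for each of the queries listed above; the operator list comes straight from this answer, while the script name and output format are just an illustrative convenience, not a tool mentioned in this report.

# build_queries.py - print a Google search URL for each log-hunting
# query listed above, for a domain given on the command line.
import sys
from urllib.parse import quote_plus

# Operators taken from the searches listed above.
OPERATORS = [
    'inurl:stats', 'inurl:statistics', 'inurl:sitestats',
    'inurl:log', 'inurl:logs', 'inurl:referrers',
    'inurl:referrer', 'inurl:referrals', 'inurl:refer',
    'inurl:admin', '"site meter"', 'inurl:awstats.pl',
    'intitle:"statistics of"',
]

def build_urls(domain):
    """Yield one Google search URL per operator for the given domain."""
    for op in OPERATORS:
        query = 'site:http://www.%s/ %s' % (domain, op)
        yield 'http://www.google.com/search?q=' + quote_plus(query)

if __name__ == '__main__':
    domain = sys.argv[1] if len(sys.argv) > 1 else 'domain.com'
    for url in build_urls(domain):
        print(url)

Run it as "python build_queries.py yourcompetitor.com" and paste the printed URLs into your browser, one at a time, to check each search.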
Q&A - Will cloaking my site get me in trouble with the search engines?

Answer: Everyone has their opinions on controversial issues such as this, and ours really doesn't differ that much from the mainstream. Cloaking is really about how and why you use it. The reality is that Google lets many sites use cloaking with its blessing. For example:

- Many newspaper sites let Google's spider in to read their content but require a human to register and log in. The only way they can do that is to use an IP delivery system; aka, cloaking.
- Sites whose content management systems produce uncrawlable session-ID URLs for human visitors show Google versions of those same pages with simple, crawlable URLs. Some call that a content delivery system; others call it cloaking.
- Advertisers running PPC campaigns use IP-based cloaking to prevent Google from indexing landing pages intended only for those campaigns.
There's also the multitude of Flash- and image-based sites that show Google a text-based version of their otherwise un-indexable pages. This is similar to using the noframes tag to show indexable content to a search engine that can't process frames. Again, the only way they can do that is to use an IP delivery system; cloaking.

There is actually a huge amount of legitimate cloaking going on all over the Internet that most people simply don't notice. Based on observing what Google does (not necessarily what they say), we can only conclude that Google is perfectly fine with it because, we suspect, such cloaking is not designed to deceive anyone. Cloaking only becomes a problem for search engines when you're deliberately showing them substantially different content than you're showing your human site visitors. However, rather than explaining the numerous circumstances in which cloaking is permitted, it's far easier for search engines like Google to simply say "cloaking is bad."

When it comes to SEO strategy, we don't make calls on what's "white hat" or "black hat" or even various shades of gray. What we do is tell you whether a working SEO strategy is sustainable or non-sustainable. That is: will it work today, tomorrow, and for many years to come? Our practice is to help you find and employ the most sustainable techniques and strategies, those that will work for your company for the long haul.

In regards to IP cloaking, our advice has always proven accurate. We've been using this strategy in numerous applications since it was introduced in 1996 and have sites with over ten years of problem-free cloaking. To be fair, there are sites that have been banned for cloaking. Those are the ones that did not use it correctly and caused a problem, or a potential problem, for the search engines. Our goal is to help you learn the difference so you can make an informed, intelligent decision relevant to the nature of your online business, your site, and the keyword topics within which it operates.
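The mechanics behind all of these examples are the same: look at who is requesting the page, then serve the appropriate version. Here is a deliberately simplified Python sketch of that decision. Be aware of its assumptions: it identifies spiders by user-agent fragment alone, which is trivially spoofed; real IP delivery systems match the visitor's IP address against maintained lists of verified spider IPs. The page bodies, port number and spider list are illustrative only.

# crawler_delivery.py - simplified sketch of an IP-delivery decision.
# Real systems check the visitor's IP against a maintained list of
# known spider addresses; matching on user-agent alone (done here
# for brevity) is easily spoofed and not production practice.
from wsgiref.simple_server import make_server

KNOWN_SPIDERS = ('googlebot', 'slurp', 'msnbot')  # user-agent fragments

def is_spider(environ):
    ua = environ.get('HTTP_USER_AGENT', '').lower()
    return any(bot in ua for bot in KNOWN_SPIDERS)

def app(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/html')])
    if is_spider(environ):
        # Crawlable version: plain HTML, no session IDs, no login wall.
        body = '<html><body><h1>Article text for indexing</h1></body></html>'
    else:
        # Human version: e.g. the registration page or Flash front end.
        body = '<html><body><h1>Please register to continue</h1></body></html>'
    return [body.encode('utf-8')]

if __name__ == '__main__':
    make_server('', 8000, app).serve_forever()

Run it, then request http://localhost:8000/ with a normal browser and again with a spoofed Googlebot user-agent to see the two versions side by side.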
Q&A - Does it make sense to optimize my site's meta tags?

Answer: First of all, submitting your site to search engines every week is worth nothing. The engines have already found your site. And even if they hadn't, it's better for them to find your site by following a link from another site already in their index.

Dublin Core (DC) meta tag development is a complete waste of time. Major search engines like Google do not index those specific tags; they are meant for places like universities and large companies to help manage their intranet document collections. We know this because we've been on the DC meta tag mailing list since its inception. We've also tested this just to make sure the engines aren't using them. They aren't.

In fact, the only meta tags search engines pay any attention to at all are the title tag (some people consider this a meta tag), the description meta tag, and in some rare cases the keywords meta tag. We also see that your current SEO company is simply sticking a keyword list in each of these tags, which is mostly useless. For instance, the meta description tag forms the description for your pages that people see in the search results. Its purpose is to convey a sales-oriented message that will entice people to click on your listing; a bare list of keywords won't do that. What's more, your objective should be to have a unique title and description for each page, focused on the main keywords that are on the page, not just a list.

Of the three tags just mentioned, the title tag is by far the most important for rankings and is something you'll want to experiment with. Even though we aren't impressed with the look of your current page titles, at least they have your important keywords in them. As a general rule, make sure each page has its own unique title tag, and use your primary keywords in your home page's title tag. Then, on the product pages, focus specifically on keywords that include the product name, part numbers, etc. Finally, always be sure to use words that people are actually going to be searching for. For more on optimizing your title tags, see last year's report, The 7 Essential Title Tag Strategies of High Ranking WebPages in 2006, which contains strategies that are still completely relevant to title tags in 2007.
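One simple way to enforce the one-unique-title-and-description-per-page rule across a large product catalog is to generate the tags from your product data. Below is a minimal Python sketch; the product fields, store name and template wording are hypothetical examples, not prescriptions.

# page_tags.py - generate a unique title tag and meta description
# for each product page from catalog data. Field names are examples.
products = [
    {'name': 'PowerShot SD 800 IS', 'part_no': 'SD800IS', 'brand': 'Canon'},
    {'name': 'PowerShot A640', 'part_no': 'A640', 'brand': 'Canon'},
]

def title_tag(p):
    # Product name and part number up front, where searchers expect them.
    return '<title>%s %s (%s) - YourStore.com</title>' % (
        p['brand'], p['name'], p['part_no'])

def meta_description(p):
    # A sales-oriented sentence, not a bare keyword list.
    return ('<meta name="description" content="Buy the %s %s '
            '(part no. %s) with free shipping and a full warranty.">' % (
                p['brand'], p['name'], p['part_no']))

for p in products:
    print(title_tag(p))
    print(meta_description(p))

The point isn't this particular wording; it's that every page gets its own title and description built around the keywords that page actually targets.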
Q&A - Is it worth my time to list my products in Google Base?

Answer: We're currently working on a detailed article about Froogle/Google Base, and we can tell you that submitting a feed is very important. As you may know, Froogle no longer accepts feed submissions; everything is done through Google Base instead. A single Google Base feed can help your site be found in Froogle, Google Base, Google Maps, and the "onebox" that appears for many Google searches. Here's an example search that displays a onebox at the top of the search results (note the small shopping-bag icon near the top):

http://www.google.com/search?q=Canon+PowerShot+SD+800+IS

Until our article is ready, we recommend your programmer spend some time exploring the documentation for creating and optimizing a Google Base feed:

http://base.google.com/support/?hl=en_US

The most important aspect of optimizing your Google Base feed is getting your keywords into the attribute fields of each product listed. Don't "stuff" the keywords in by repeating them over and over. Instead, decide on a set of related keywords you'd like each product to be found for, then work them into your product's title, description and other attributes in a natural and readable way. Intelligent use of keywords within attributes is the key to optimizing your Google Base feed.
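To make the feed idea concrete, here is a Python sketch that writes a tab-delimited bulk-upload file of the kind Google Base accepts, one attribute per column and one item per row. The attribute names and item data below are illustrative assumptions; the documentation linked above lists the attributes actually required for your item type.

# base_feed.py - write a tab-delimited Google Base bulk-upload file.
# Attribute names below are illustrative; check the Google Base docs
# for the required attributes for your item type.
import csv

items = [
    {'title': 'Canon PowerShot SD 800 IS digital camera',
     'description': 'Compact 7.1-megapixel camera with image '
                    'stabilization. Free shipping.',
     'link': 'http://www.yourstore.com/powershot-sd800is.html',
     'price': '349.99 usd'},
]

FIELDS = ['title', 'description', 'link', 'price']

with open('products.txt', 'w', newline='') as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS, delimiter='\t')
    writer.writeheader()   # first row names the attributes
    writer.writerows(items)

The same keyword advice applies here as in the answer above: work your chosen terms into the title and description naturally rather than repeating them.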
Q&A - What are the rules in regards to interlinking my own sites? - On the subject of having multiple sites on the same topic that interlink: I realize that if they are on the same IP, you probably don't want to interlink them. But if they are on completely separate IPs, would you want to interlink them as much as possible? Also, does the domain name registration have anything to do with Google actually realizing the sites are from the same owners? Ideally, I would like to have hundreds of domains out there that have slightly different content but the same end result: helping me pull in more sales. However, I don't want Google to think I'm creating a mini-net. To avoid this, would you suggest different hosting companies? Could I have two domains on the same hosting company and just make sure they are in different class C blocks? This is very confusing to me.

Answer: It depends quite a bit on what your intentions are. If you are attempting to artificially manipulate your search rankings by creating multiple sites and linking them together (the mini-net technique), then you'd better do everything you can to prevent search engines from discovering that all of those sites are owned by the same person. This would involve hosting them with different web hosts (which means they'll be on different IP addresses and class "C" blocks). But that is not all you'll need to do. You'd also want to register each site with different whois info, use different contact info on each site, and use different design templates. In short, such a strategy dictates that you hide ownership of the various sites if you're going to be doing a lot of linking between them. That's assuming, of course, that your goal is to "trick" the search engines into believing you have more links than you really do.

Clearly, this requires a LOT of work. And we don't feel it's a sustainable strategy. Unless you've covered your tracks very well, at some point the search engines will find your mini-net, due to the likelihood that it will appear as an isolated node, devoid of links from any sites outside your ownership influence. As such, your sites will probably be penalized or banned eventually, causing you to either give up or start all over. We talk about the different types of mini-nets and why some are risky in this report. We also discuss how to keep your sites from appearing to be a mini-net if you have multiple sites. It's a risky strategy, and you're much better served building legitimate links that will keep you on top of the search engines for the long term, without having to worry that one day all of your sites will suddenly vanish from the search results.
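For reference, two IP addresses are in the same class "C" block when their first three octets match. Here is a small Python sketch for checking where a pair of hosts stands; the host names are placeholders.

# class_c.py - check whether two hosts resolve to IPs in the same
# class "C" block (i.e. the first three octets match).
import socket

def class_c(host):
    ip = socket.gethostbyname(host)
    return '.'.join(ip.split('.')[:3])

a, b = 'www.example.com', 'www.example.org'   # placeholder host names
same = class_c(a) == class_c(b)
print('%s and %s share a class C block: %s' % (a, b, same))

Keep in mind, as the answer above notes, that a shared class C block is only one of many ownership signals a search engine can look at.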
Q&A - Is it safe to make a significant change to the topic of my site? - If we want to use an old domain which was used in another industry and simply change it to be about a new industry, will that be better in terms of SEO success than starting a new site? This domain has been around since 2000, so it's quite old and trusted, and I'd like to launch a new site on it to help avoid the Google Sandbox. However, the old domain is currently on a very different topic than the one I'm planning to change it to.

Answer: Although old domains do tend to have a strong ranking advantage over new domains, they don't necessarily gain a free pass to the top of the search results. One of the primary considerations is that, when a site is sold to a new owner and the whois info changes, Google will often detect this and erase whatever credit it was giving to the site's incoming links, PageRank and birthdate. Of course, they do this to dissuade people from buying old sites just to cash in on those sites' incoming link popularity. But it's a tricky call, because sometimes Google catches the change and sometimes it doesn't. They've also been known to let bigger sites keep their backlink and PageRank credits while wiping the ledger clean for smaller sites.

If you're looking at buying an old site, either to make it your business' main site or to link it to your main site, use a tool like domaintools.com's whois history to see who owned that site in the past:

http://domain-history.domaintools.com/

If the domain has changed hands many times, then it's possible that Google doesn't regard the site's birthdate as the day it was launched, but instead considers it to be only as old as the last time the site changed hands. You can also use the Wayback Machine to see how a site has changed over time. For instance, take a look at the Wayback record for the popular social media site Digg.com:

http://web.archive.org/web/*/http://digg.com

According to Wayback, Digg launched in 1998. However, click on the pages that Wayback indexed back then and you'll see the site was actually the homepage for a company called Digg Records:

http://web.archive.org/web/19990125094800/http://www.digg.com/

By 2001 it appears Digg Records let the domain drop, and it was parked using a generic placeholder:

http://web.archive.org/web/20010202015800/http://www.digg.com/

Then, it wasn't until December 2004 that the site started getting spidered regularly and indexed as the actual Digg.com we all know and (maybe) love:

http://web.archive.org/web/20041209040106/http://digg.com/

In such cases, Google typically sees the domain birthdate as 2004, not 1998. The point is that a site which has changed owners or themes may not be as "old" as it looks. Domaintools.com's whois history and Wayback's cached pages are a great way to get a feel for a site's history.

By the way, Digg.com is also a great example of the exception to the rule that new sites can't rank well. Clearly, strong marketing or an innovative concept that takes off in a viral way can overcome the disadvantage of a recent birthdate. Other examples of new sites that rank well for extremely competitive queries are the real estate finder Zillow.com and the Internet video site YouTube.com.

So, domain age is not everything, but it helps. Whenever all else is equal, you'll be ahead of the game with an older domain. However, be sure to factor in the other ranking elements, such as the links each site has and the current topic of each. Also be aware that it isn't unusual for a site's topic to change gradually over time; but if you shift it radically and too suddenly, you could create rankability problems for the site. So shifting the topic of an old site isn't a strategy you can count on to work. It's better if whatever topic changes you make are gradual. If you must shift the theme of an old site, then strive to accomplish the change in ways that Google won't consider it to be a new site. Perhaps you can add new sections while temporarily keeping the old sections. Then, as your new sections take root in the rankings, gradually do away with the old sections that are no longer needed. Expect this to be a gradual process that can easily take a year or so to complete.