Tuesday, August 21, 2007

Search Engine Bytes
Expert Answers to Hot Topic SEO Questions

  • Is Yahoo's Search Submit Basic critical to rankings, or a waste of time and money?
  • Three critical reasons for websites to avoid sharing an IP address.
  • How to view Google's search results in foreign countries: little-known insider tips!
  • Avoid tripping Google's duplicate content filters: how different must webpages be?
  • Subdomains vs. subdirectories: how their incoming link juice affects the overall site.
  • How to market your site in MySpace *without* looking like a spammer.


Is Yahoo's Search Submit Basic critical to rankings or a waste of time and money?

  • I recently noticed that Yahoo changed their Search Submit Express program from a pay-per-click model to an annual fee per URL. They also changed the name to Search Submit Basic. In light of these recent changes, is there any value in paying $49 per URL for this submission?

Answer: If you're already listed in Yahoo then you'll continue to be listed whether you pay this fee or not. Even so, the expanded reporting features and the free analytics you receive on each submitted URL are attractive and may be worth the $49 a year as an add-on to your regular SEO marketing plan. However, don't expect SSB to automatically increase your rankings. Your pages will still be subject to the current Yahoo ranking algorithm. If they aren't optimized correctly, or your incoming links are lacking, the paid inclusion won't help you in the least.

On the other hand, if you're having trouble getting important pages indexed by Yahoo, then submitting them to Search Submit Basic will ensure those pages get regularly crawled by Yahoo. In that case, you'll probably find the service useful and worth the cost of the paid inclusion. But if your site is already getting indexed by Yahoo then there's little incentive to pay for something that you are already getting for free.

By the way, the services we find most worth paying for at Yahoo are the Yahoo Directory and Yahoo's PPC program, Yahoo Sponsored Search. The $299 directory fee is still one of the best values in SEO thanks to the authoritative link you receive, and the PPC program ensures you both traffic and total control over your ads.



Three critical reasons for websites to avoid sharing an IP address.

  • Our website is currently hosted on an IP address that is shared with over 9000 other domains. I would like to host our website on a dedicated server with its own unique IP address in order to score better in Google. Although we're number 1 for many relevant keywords, we'd like to improve our score and become even harder for our competition to beat.

    The question is: Is it worthwhile to have a unique IP address? Would it help our site rank better?

Answer: Even though having a unique IP will not directly improve your rankings, there are three reasons why it's important for your site to obtain its own unique IP address.

  1. If any site on a shared IP address network gets banned for spam, the penalties could be applied to every site hosted on that IP address. Although this is an increasingly rare occurrence (these days Google is much more likely to blacklist domains rather than IP addresses), it can still happen.

  2. If you happen to be using any form of interlinked mininet you're strategically better off putting all your sites on different IPs and class C blocks. In fact, even if you're not interlinking your own sites, putting them on separate IPs is very important because it prevents competitors from using reverse IP lookup to find all of your sites.

    The reality is, a site doesn't necessarily have to do anything "bad" to get penalized. It only has to be the target of what somebody else "says" is bad. Over the past decade we've seen many instances where engines have penalized sites for infractions they were merely believed to have committed, while at the same time ignoring prominent sites that were flagrantly guilty of the same so-called infractions.

    For this reason, it isn't in your best interest to make it easy for anyone to find all of your sites with a simple reverse lookup (a quick self-check is sketched at the end of this answer). Spreading them around on different IPs and Class C blocks is just good precautionary strategy, regardless of whether or not you're engaging in gray-hat or black-hat SEO strategies.

  3. There have also been instances where shared IP sites are incorrectly crawled by Google. This can result in your site being cached under another domain. It may look like your site, but anyone who links to it will actually be linking to someone else. Obviously, this is never good.

Even though a unique IP address will not help your site rank better, it's a fairly cheap, common-sense way to protect your site from these glitches. They rarely happen, but when they do they can lead to catastrophic losses of income. Here's the test: if your site is important to you, especially if it's making money, then we'd advise you to eliminate any chance of being affected by these glitches. Having a unique, dedicated IP address for each of your money-making sites is cheap and will help you sleep better.
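
If you're curious how exposed your own sites are, here's a minimal sketch in Python you could adapt (the domain names are placeholders; note that a full reverse-IP lookup, which lists every domain hosted on an address, requires a third-party service, so this only checks overlap among sites you already know about):

    # Minimal sketch: check whether your own domains resolve to the same IP.
    # The domain names below are placeholders -- substitute your own sites.
    import socket
    from collections import defaultdict

    my_domains = ["example-site-one.com", "example-site-two.com"]

    sites_by_ip = defaultdict(list)
    for domain in my_domains:
        try:
            sites_by_ip[socket.gethostbyname(domain)].append(domain)
        except socket.gaierror:
            print(f"could not resolve {domain}")

    for ip, domains in sites_by_ip.items():
        if len(domains) > 1:
            print(f"{ip} is shared by: {', '.join(domains)}")

If any of your money-makers turn up on the same address, that's your cue to talk to your host about dedicated IPs.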



Little-known insider tips on viewing Google's search results in other countries.

  • I'm based in the UK and I'm trying to view results from Google US. Unfortunately, while I can view Google.ru, Google.au, Google.ca and so on, I can't query Google as though I were in the United States. Every time I do a Google.com search it sends me back to the Google.co.uk site.

    I would very much like to monitor our ranking from a US perspective. How do we do this?

Answer: Google uses your computer's IP address, which is frequently the Internet address of your Internet Service Provider (ISP), to determine your location and the corresponding Google domain to send you to. Thus, even when you enter www.google.com while searching from the UK, you're automatically transferred to www.google.co.uk.

Fear not! ...Google help to the rescue. Take a look at: How do I stop Google.com from redirecting to another Google domain? ...here's the important part:

1. Click on the "Google.com" link on the bottom right-hand side of the Google homepage.

2. If you have cookies enabled, your browser will connect directly to Google.com on all subsequent visits. For more information on cookies, please visit http://www.google.com/cookies.html and http://www.google.com/privacy.html

3. If cookies are disabled, you'll experience the same redirect each time you visit Google. You can solve this either by enabling cookies or setting a bookmark for http://www.google.com/webhp. In the latter case, you'll be taken to http://www.google.com/webhp, which is exactly the same as Google.com, each time you select the bookmark.

If for some reason that doesn't work, you can also use Google's geo-location parameter by adding gl plus the two-letter country code to your search URL. For example, to see US results:

http://www.google.com/search?q=your+keyword&gl=us

Here's a complete list of Google country codes. By the way, another cool thing about this feature is that it also allows you to see the AdWords ads that are being targeted to that country.
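
If you're monitoring rankings across several countries, a small script can build these URLs for you. Here's a minimal sketch in Python (the keyword and country codes are placeholders):

    # Minimal sketch: build country-targeted Google search URLs using the
    # gl parameter described above. Keyword and country codes are placeholders.
    from urllib.parse import urlencode

    def google_search_url(keyword, country_code):
        # gl tells Google which country's results (and AdWords ads) to show
        return "http://www.google.com/search?" + urlencode(
            {"q": keyword, "gl": country_code}
        )

    for cc in ["us", "uk", "ca"]:
        print(google_search_url("your keyword", cc))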



To avoid tripping Google's duplicate content filters, how different must webpages be?

  • How much does the content of a page need to change for it to be classed as unique and not duplicate content? I appreciate that it's a question that's a bit broad in its scope, but any advice would be appreciated.

Answer: There's no known exact percentage of duplicate content on a page that will trigger a duplicate content filter. However, if you have multiple pages on your site that contain the exact same content then it's a pretty safe bet you're going to trigger a filter. For example, if you have:

  • a printer-friendly page that's identical to your regular web page,
  • a blog with category and date archives featuring exact copies of your regular posts, or
  • a content management system that assigns multiple URLs to the same page...

...you're going to have duplicate content problems. Although these duplications are usually innocent, all of the above instances have been known to trip Google's duplicate content filter and banish such pages to the supplemental index.

On the other hand, if you're using a site-wide template that's identical for all pages (i.e., same headers, footers, navigational elements, etc.) that's not likely to cause a problem. Search engines are pretty good at recognizing and separating the static parts of your page from the unique parts of your page.

However, we have seen instances where bloated template elements positioned before the unique content in the HTML code caused a page to be pushed into the supplemental index. For example, a very large left-column menu that appears early in the HTML code can force a search engine to process a lot of identical text before it reaches the unique parts of the page. If many or all pages on the site use the same bloated menu, those pages will all look alike, which causes them to go supplemental.

One solution is to position the unique, indexable content of your page near the top of your HTML code. You can learn more on that topic in our report How To Optimize HTML Tables for Search Engines. In cases where we've seen the bloated template issue cause problems, there were several hundred items in the left-hand menu that the search engine had to plow through. So another solution may be to pay more attention to usability and use smaller menus :)

When search engines find duplicate content on multiple different sites, they generally index the original source and either remove pages that are duplications or else put them in the supplemental index. Typically they decide which page is the duplicate based on:

  • where they first crawled the page,
  • which site has more "trust" (i.e., better links, older domain, higher PageRank, etc.), and
  • whether the duplicate pages all link back to the original copy.

For the most part this is not a significant problem, as search engines usually identify the original source of the article correctly. However, if content theft is creating serious problems for you, be sure to read our report What To Do When Someone Steals Your Content.

As a general rule, if a page is the same as another page, then it's likely to be removed for duplicate content. If a page is just similar to another page and contains some duplicate elements (such as the static, navigational elements or if it contains "snippets" of identical content from another page) it's generally not going to trip a duplicate content filter. Bear in mind, however, there are no ironclad rules about duplicate content "percentages". It's something you just have to pay close attention to and develop a feel for.
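
While there's no official threshold, you can at least get a rough feel for how similar two of your own pages are with a quick text comparison. Here's a minimal sketch using Python's standard difflib (the sample text and the 0.9 cutoff are purely illustrative, not known filter values):

    # Minimal sketch: estimate how similar the text of two pages is.
    import difflib

    def similarity(text_a, text_b):
        # Returns a ratio from 0.0 (nothing shared) to 1.0 (identical)
        return difflib.SequenceMatcher(None, text_a, text_b).ratio()

    # Placeholder text -- in practice, compare the extracted text of real pages.
    page_a = "Widgets for sale. Printer-friendly version of our widget page."
    page_b = "Widgets for sale. Regular version of our widget page."

    score = similarity(page_a, page_b)
    print(f"similarity: {score:.2f}")
    if score > 0.9:  # arbitrary illustration, NOT a known Google cutoff
        print("near-duplicates; consider differentiating these pages")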



Subdomains vs. Subdirectories: How their incoming link juice affects the overall site.

  • In the April 2007 issue you wrote:

    "Content added to subdomains of your trusted domain can be expected to rank immediately, whereas if you added that content to a new domain you'd likely have to wait a year or so and do lots of link building before you'll see that content achieve high rankings. So it makes sense to launch a new venture on a subdomain of an existing site, then move it over to your new site once that new site has established some age and trust."

    It has, however, long been held by some SEOs that links from subdomains of sites will have less value or be filtered in terms of passing PageRank. Can you illuminate?

Answer: We see no evidence that this is true. In our experience, links from subdomains pack just as much punch as links from subdirectories or any other part of the site.

However, consider the opposite situation: when links point to subdomain sections of your site, there is reason to believe those links won't help your overall site as much as links pointed at the non-subdomain (i.e., subdirectory) sections of your site. Here's why:

As we've previously mentioned in past SE bytes, getting a link to any page on your site is going to help the rankings of every other page on your site—each link you add not only boosts the page it links to, but also increases the overall value of your site from the search engines' point of view (SEPOV). This is part of the reason it's important to have deep links.

There are strong indications that Google tends to view a subdomain as a website somewhat separate from the root domain. This point is best illustrated by the fact that a site can get multiple listings for the same keyword in Google using subdomains. For example, a search for Google brings up listing after listing of Google's own subdomains (www.google.com, video.google.com, news.google.com, and so on). Using subdomains, Google controls the entire first page (actually most of the first four pages) in a keyword search for Google.

Therefore, there are pluses and minuses to this strategy. On the plus side, subdomains are very good for reputation management. After all, when a single company has all of the top listings, it can more easily control what's being said about it at the top of the search results. However, when thinking in terms of getting links to your site, the subdomain strategy shows a weakness.

While it's true that getting links to your subdomains will boost the rankings of all the pages on your subdomain, the juice from these links is less likely to splash over to the pages on your main domain or your other subdomains. That's because of Google's tendency to see them as different sites.

On the other hand, if you keep your entire site in subdirectories below your main domain, every link you get will increase the value of the site as a whole. That's the tradeoff, and the Achilles' heel of the subdomain strategy.

The bottom line is: with subdomains, the ranking improvement is more likely to be focused primarily on those pages on the subdomain. We say primarily because Google doesn't view subdomains as completely different sites. They clearly know the various subdomains of a site all belong to that site, they just choose to separate them out at the moment. Subdomain links do improve the rankings of the site as a whole, just not as much as if the link pointed to one of the pages on the main domain.
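
To make the distinction concrete, here's a minimal sketch in Python that classifies where a URL sits relative to a root domain (example.com is a placeholder), which is essentially the boundary Google appears to draw:

    # Minimal sketch: classify a URL as main-domain (subdirectory) content,
    # subdomain content, or external. "example.com" is a placeholder root.
    from urllib.parse import urlparse

    def placement(url, root="example.com"):
        host = urlparse(url).netloc.lower()
        if host == root or host == "www." + root:
            return "main domain (subdirectory content)"
        if host.endswith("." + root):
            return "subdomain (treated more like a separate site)"
        return "external site"

    print(placement("http://www.example.com/video/page.html"))  # main domain
    print(placement("http://video.example.com/page.html"))      # subdomain

Links pointing at URLs in the first category spread their juice across the whole site; links pointing at the second category mostly stay put.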

We still feel that subdomains are generally a good way to organize your site, especially in light of the fact that they can help you occupy several listings at once for a keyword. And we certainly wouldn't recommend altering an existing site to remove subdomains. If you're considering using subdomains for a new section of your site, ask yourself:

"Is the new information on the site of such breadth and detail that a new website could conceivably be constructed to house the content?"

If the answer is yes, then consider putting that content into a subdomain. For example, Google could have created a whole new site called GoogleVideo.com, but for branding purposes they chose to go with Video.Google.com. If the new section of your site doesn't meet the above criteria, then going with a subdirectory is likely the best move.

For more in-depth details, see our recent report: Mastering the SE Marketing Strategy of Subdomains.



How to market your site in MySpace *without* looking like a spammer.

  • Do links from MySpace.com carry a lot of weight with Google? I noticed a competitor has lots of links from MySpace, and it looks like they've placed their own ad in place of comments on many different MySpace profiles.

    I would think this is a risky strategy. Overall, their site is very poorly constructed, with lots of JavaScript, the same page titles on every page, missing alt tags, not a lot of text, etc. However, this month they rank #1 for a couple of my keywords.

Answer: MySpace links do carry some weight, though not a lot. Google does index pages on MySpace, including member profiles, as you can see here: site:http://profile.myspace.com/

And links on those pages are direct, free of the nofollow attributes, redirection, or noindex meta tags that might otherwise prevent link juice from passing. That means search engines count those links.

However, those links don't appear to have much ranking impact. Most MySpace profiles are PageRank 0 pages with few incoming links of their own. For example, Pyzam.com has about 300,000 inbound links, many of them from MySpace: link:http://www.pyzam.com/

...yet doesn't appear to be ranking for many important MySpace-related keywords that we can see. Also, their homepage is a PageRank 4. If you've got 300,000 links and you're still a PageRank 4, then you know most of those links are not of especially high value.

Also, there's a point of diminishing returns when all your links come from the same site. 10,000 links from one site is generally not as valuable as one link each from 10,000 sites. Obviously, it depends on which sites you're getting links from, but as a general rule having your links spread over a wide range of IP addresses is best.
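
One way to gauge how concentrated a backlink profile is: count unique linking domains rather than raw links. Here's a minimal sketch in Python (the sample backlinks are placeholders; in practice you'd export the list from whatever link-research tool you use):

    # Minimal sketch: count unique linking domains in a list of backlink URLs.
    from urllib.parse import urlparse
    from collections import Counter

    backlinks = [  # placeholders -- use a real backlink export
        "http://profile.myspace.com/someuser",
        "http://profile.myspace.com/anotheruser",
        "http://www.some-blog.com/post",
    ]

    domains = Counter(urlparse(u).netloc for u in backlinks)
    print(f"{len(backlinks)} links from {len(domains)} unique domains")
    for domain, count in domains.most_common():
        print(f"  {domain}: {count}")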

As you mentioned, links from MySpace profiles are typically acquired by creating your own profile, adding other people as your friends, then leaving comments on their pages with your links embedded, or sending them bulletins (similar to comments but sent to everyone on your friends list all at once).

One problem with pursuing MySpace links is that if you push too hard you risk looking like a spammer. There are many aggressive MySpace marketers who use MySpace bots to automatically add friends and schedule the process of sending out comments and bulletins containing their ads or links to their sites.

Just about anyone who signs up with MySpace will start receiving these fake friend requests before long, and many people despise them. If you spam the wrong person (like a high-profile blogger) too many times it could seriously damage your company's reputation.

Still, you can get good results from MySpace marketing if you take the right, systematic approach.

  1. Create a real profile for your company and only send your friend requests out to people whose profiles strongly indicate they would be interested in what you have to offer.

  2. Browse the MySpace Groups and see which groups would be relevant to your company. Join the most relevant ones and start interacting with the people who belong to them.

  3. Once you've established a bit of rapport, you should have no problem making comments on people's profiles with a link to your site included. You could also send out the occasional bulletin to people on your friends list.

Obviously this is a lot more work than doing it the automated way, but the upside is that you'll get some links as well as some fairly targeted traffic from these links. Most importantly you won't risk your company's reputation by looking like a spammer.

We'd have to see the site and keywords you mentioned in your question, but our hunch is that your competitor's rankings are not primarily due to their MySpace links. If you dig a little deeper you'll likely find other factors, such as links from other sources besides MySpace, site age, or possibly even something a bit more blackhat.

Finally, we suggest using our Website Quality Indicator Tool to perform a competitive analysis on your competitor's site. Chances are this will shed some additional light on why the site is ranking well.
