Search Engine Bytes
Expert Answers to Hot Topic SEO Questions
In this issue:
Is Yahoo's Search Submit Basic critical to rankings ...or a waste of time and money?
Three critical reasons for websites to avoid sharing an IP address.
How to view Google's search results in foreign countries: Little-known insider tips!
Avoid tripping Google's duplicate content filters: How different must webpages be?
Subdomains vs. Subdirectories: How their incoming link juice affects the overall site.
How to market your site in MySpace *without* looking like a spammer.
Is Yahoo's Search Submit Basic critical to rankings or a waste of time and money?
Answer: If you're already listed in Yahoo, you'll continue to be listed whether you pay this fee or not. Even so, the expanded reporting features and the free analytics you receive on each submitted URL are attractive and may be worth the $49 a year as an add-on to your regular SEO marketing plan.

However, don't expect SSB to automatically increase your rankings. Your pages will still be subject to the current Yahoo ranking algorithm. If they aren't optimized correctly, or your incoming links are lacking, the paid inclusion won't help you in the least.

On the other hand, if you're having trouble getting important pages indexed by Yahoo, then submitting them to Search Submit Basic will ensure those pages get regularly crawled. In that case, you'll probably find the service useful and worth the cost. But if your site is already getting indexed by Yahoo, there's little incentive to pay for something you're already getting for free.

By the way, the services we find most worth paying for at Yahoo are the Yahoo Directory and Yahoo's PPC program, Yahoo Sponsored Search. The $299 directory fee is still one of the best values in SEO due to the authoritative link you receive, and the PPC program gives you both traffic and total control over your ads.
Three critical reasons for websites to avoid sharing an IP address.
Answer: Even though having a unique IP will not directly improve your rankings, there are three reasons why it's important for your site to obtain its own unique IP address.

A unique IP address is a fairly cheap, common-sense way to protect your site from these glitches. They rarely happen, but when they do occur they can lead to catastrophic losses of income. Here's the test: if your site is important to you, especially if it's making money, we'd advise that you eliminate any chance of being affected by these glitches. Having a unique, dedicated IP address for each of your money-making sites is cheap and will help you sleep better.
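A quick way to see whether your site shares an IP with other domains is simply to resolve each hostname and compare the results. Here's a minimal sketch using Python's standard socket module; the domain names and addresses are illustrative placeholders, not real measurements:

```python
import socket
from collections import defaultdict

def group_by_ip(resolved):
    """Group (domain, ip) pairs by IP address so shared addresses stand out."""
    groups = defaultdict(list)
    for domain, ip in resolved:
        groups[ip].append(domain)
    return dict(groups)

def resolve_domains(domains):
    """Resolve each domain to an IPv4 address (requires network access)."""
    return [(d, socket.gethostbyname(d)) for d in domains]

# Illustrative data: two domains sharing one address.
pairs = [("example.com", "93.184.216.34"),
         ("example.net", "93.184.216.34"),
         ("example.org", "93.184.216.1")]
for ip, domains in group_by_ip(pairs).items():
    if len(domains) > 1:
        print(ip, "is shared by", ", ".join(domains))
```

Run `resolve_domains` on your own domain plus any neighbors you suspect, then feed the result to `group_by_ip`; any IP with more than one domain is shared hosting.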
Little-known insider tips on viewing Google's search results in other countries.
Answer: Google uses your computer's IP address, which is usually the Internet address of your Internet Service Provider (ISP), to determine your location and the corresponding Google domain to send you to. Thus, even when you enter www.google.com while searching from the UK, you're automatically redirected to www.google.co.uk. Fear not! ...Google help to the rescue. Take a look at: How do I stop Google.com from redirecting to another Google domain? Here's the important part: 1. Click the "Google.com" link on the bottom right-hand side of the Google homepage. If for some reason that doesn't work, you can also use Google's geo-location parameter by adding gl plus the country code to your search URL. For example, http://www.google.com/search?q=red+roses&gl=fr returns the results a searcher in France would see.
Here's a complete list of Google country codes. By the way, another cool thing about this feature is that it also allows you to see the AdWords ads that are being targeted to that country.
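If you check several countries regularly, the country-restricted URL described above can be assembled with Python's standard library. A minimal sketch; the query and country code are placeholders, and the gl parameter behaves as described in the answer:

```python
from urllib.parse import urlencode

def google_search_url(query, country_code):
    """Build a Google search URL restricted to a country via the gl parameter."""
    params = urlencode({"q": query, "gl": country_code})
    return "http://www.google.com/search?" + params

print(google_search_url("red roses", "fr"))
# http://www.google.com/search?q=red+roses&gl=fr
```

Using urlencode rather than string concatenation keeps multi-word queries safely escaped.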
To avoid tripping Google's duplicate content filters, how different must webpages be?
Answer: There's no known exact percentage of duplicate content on a page that will trigger a duplicate content filter. However, if you have multiple pages on your site that contain the exact same content then it's a pretty safe bet you're going to trigger a filter. For example, if you have:
...you're going to have duplicate content problems. Although usually innocent duplications, all of the above instances have been known to cause Google's duplicate content filter to banish such pages to the supplemental index.

On the other hand, if you're using a site-wide template that's identical for all pages (i.e., same headers, footers, navigational elements, etc.), that's not likely to cause a problem. Search engines are pretty good at recognizing and separating the static parts of your page from the unique parts.

However, we have seen instances where bloated template elements positioned before the unique content in the HTML code caused a page to be pushed into the supplemental index. For example, a very large menu in the left column, positioned early in the HTML code, might force a search engine to process too much identical text before it reaches the unique parts of the page. If many or all pages on the site use the same bloated menu, these pages will all look alike, and that causes them to go supplemental.

One solution is to position the unique, indexable content of your page near the top of your HTML code. You can learn more on that topic in our report How To Optimize HTML Tables for Search Engines. In the cases where we've seen the bloated template issue cause problems, there were several hundred items in the left-hand menu that the search engine had to plow through, so another solution is to pay more attention to usability and use smaller menus :)

When search engines find duplicate content on multiple different sites, they generally index the original source and either remove the duplicate pages or put them in the supplemental index. Typically they decide which page is the duplicate based on:
For the most part this is not a significant problem, as search engines usually identify the original source of the article correctly. However, if content theft is creating serious problems for you, be sure to read our report What To Do When Someone Steals Your Content. As a general rule, if a page is the same as another page, then it's likely to be removed for duplicate content. If a page is just similar to another page and contains some duplicate elements (such as the static, navigational elements or if it contains "snippets" of identical content from another page) it's generally not going to trip a duplicate content filter. Bear in mind, however, there are no ironclad rules about duplicate content "percentages". It's something you just have to pay close attention to and develop a feel for.
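Since there's no official duplicate-content percentage, the best you can do is develop a rough feel for how similar two pages' visible text really is. Here's a small sketch using Python's difflib; the sample text and the 0.9 threshold are purely illustrative assumptions, not a known filter value:

```python
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Return a 0.0-1.0 ratio of how similar two blocks of text are."""
    return SequenceMatcher(None, text_a, text_b).ratio()

page_a = "Widgets ship free. Our widgets are hand-made in small batches."
page_b = "Widgets ship free. Our gadgets are machine-made in large batches."

score = similarity(page_a, page_b)
print(round(score, 2))
if score > 0.9:  # arbitrary illustrative threshold, not a Google number
    print("These pages may look like duplicates to a filter.")
```

Comparing only the unique body copy (with templates stripped out) gives a more honest picture than comparing full HTML source.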
Subdomains vs. Subdirectories: How their incoming link juice affects the overall site.
Answer: We see no evidence that links from subdomains are weaker. In our experience, links from subdomains pack just as much punch as links from subdirectories or any other part of the site.

However, consider the opposite situation: when links point to subdomain sections of your site, there is reason to believe those links won't help your overall site as much as if they pointed to the non-subdomain (i.e., subdirectory) sections. Here's why: as we've mentioned in past SE Bytes, getting a link to any page on your site helps the rankings of every other page on your site. Each link you add not only boosts the page it links to, but also increases the overall value of your site from the search engines' point of view (SEPOV). This is part of the reason it's important to have deep links.

There are strong indications that Google tends to view a subdomain as a website somewhat separate from the root domain. This point is best illustrated by the fact that a site can get multiple listings for the same keyword in Google using subdomains. For example, a search for Google brings up:

http://www.google.com/
http://images.google.com/
http://earth.google.com/
http://maps.google.com/
http://video.google.com/

...and so on. Using subdomains, Google controls the entire first page (actually most of the first four pages) in a keyword search for Google.

There are plusses and minuses to this strategy. On the plus side, subdomains are very good for reputation management. After all, when a single company has all of the top listings, it can more easily control what's being said about it at the top of the search results. However, when thinking in terms of getting links to your site, the subdomain strategy shows a weakness.
While it's true that getting links to your subdomains will boost the rankings of all the pages on that subdomain, the juice from these links is less likely to splash over to the pages on your main domain or your other subdomains. That's because of Google's tendency to see them as different sites. On the other hand, if you keep your whole site in subdirectories below your main domain, every link you get will increase the value of the site as a whole. That's the tradeoff and the Achilles heel of the subdomain strategy.

The bottom line: with subdomains, the ranking improvement is more likely to be focused primarily on the pages of that subdomain. We say primarily because Google doesn't view subdomains as completely different sites. Google clearly knows the various subdomains of a site all belong to that site; it just chooses to separate them out at the moment. Subdomain links do improve the rankings of the site as a whole, just not as much as if the link pointed to one of the pages on the main domain.

We still feel that subdomains are generally a good way to organize your site, especially since they can help you occupy several listings at once for a keyword. And we certainly wouldn't recommend altering an existing site to remove subdomains. If you're considering using subdomains for a new section of your site, ask yourself: "Is the new information of such breadth and detail that a new website could conceivably be constructed to house the content?" If the answer is yes, then consider putting that content into a subdomain. For example, Google could have created a whole new site called GoogleVideo.com, but for branding purposes they chose to go with Video.Google.com. If the new section of your site doesn't meet the above criteria, then a subdirectory is likely the best move.

For more in-depth details, see our recent report: Mastering the SE Marketing Strategy of Subdomains.
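The distinction above can be made concrete: a crawler that keys on the full hostname sees video.google.com and www.google.com as different sites even though they share a registered domain. Here's a minimal sketch; note the naive "last two labels" rule for the registered domain is a simplification that breaks for ccTLDs like .co.uk:

```python
from urllib.parse import urlparse

def registered_domain(url):
    """Naive registered domain: last two hostname labels (ignores ccTLD quirks)."""
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

def relation(url_a, url_b):
    """Classify two URLs the way a host-keyed crawler might."""
    host_a, host_b = urlparse(url_a).hostname, urlparse(url_b).hostname
    if host_a == host_b:
        return "same host"
    if registered_domain(url_a) == registered_domain(url_b):
        return "same site, different subdomain"
    return "different sites"

print(relation("http://video.google.com/", "http://www.google.com/"))
# same site, different subdomain
```

A real implementation would consult the Public Suffix List instead of the two-label shortcut, but the classification logic is the same.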
How to market your site in MySpace *without* looking like a spammer.
Answer: MySpace links do carry some weight, though not a lot. Google does index pages on MySpace, including member profiles, as you can see here: site:http://profile.myspace.com/ And the links on those pages are direct, free of the nofollows, redirection, or noindex meta tags that might otherwise prevent link juice from passing. That means search engines count those links.

However, those links don't appear to have much ranking impact. Most MySpace profiles are PageRank 0 pages with few incoming links of their own. For example, Pyzam.com has about 300,000 inbound links, many of them from MySpace: link:http://www.pyzam.com/ ...yet doesn't appear to be ranking for many important MySpace-related keywords that we can see. Also, their homepage is a PageRank 4. If you've got 300,000 links and you're still a PageRank 4, then you know most of those links are not of especially high value.

Also, there's a point of diminishing returns when all your links come from the same site. 10,000 links from one site is generally not as valuable as one link each from 10,000 sites. Obviously, it depends on which sites you're getting links from, but as a general rule having your links spread over a wide range of IP addresses is best.

As you mentioned, links from MySpace profiles are typically acquired by creating your own profile, adding other people as your friends, then leaving comments on their pages with your links embedded, or sending them bulletins (similar to comments but sent to everyone on your friends list at once). One problem with pursuing MySpace links is that if you push too hard you risk looking like a spammer. Many aggressive MySpace marketers use bots to automatically add friends and schedule comments and bulletins containing their ads or links. Just about anyone who signs up with MySpace will start receiving these fake friend requests before long, and many people despise them.
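The diminishing-returns point can be illustrated with a toy model: score a backlink profile by unique linking domains rather than raw link counts. The formula below is purely an illustration we made up for this sketch, not any search engine's actual algorithm:

```python
from collections import Counter
from math import log

def link_profile_score(backlinks):
    """Toy score: each linking domain contributes 1 + ln(links_from_it),
    so the 10,000th link from one site adds far less than a new domain."""
    per_domain = Counter(domain for domain, _target in backlinks)
    return sum(1 + log(count) for count in per_domain.values())

one_site = [("myspace.com", "pyzam.com")] * 10000
many_sites = [(f"site{i}.example", "pyzam.com") for i in range(10000)]

print(round(link_profile_score(one_site)))    # 10
print(round(link_profile_score(many_sites)))  # 10000
```

Under this model, 10,000 links from one domain are worth roughly as much as ten well-spread links, which matches the intuition in the paragraph above.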
If you spam the wrong person (like a high-profile blogger) too many times it could seriously damage your company's reputation. Still, you can get good results from MySpace marketing if you take the right, systematic approach.
Obviously this is a lot more work than the automated way, but the upside is that you'll get some links as well as some fairly targeted traffic from them. Most importantly, you won't risk your company's reputation by looking like a spammer.

We'd have to see the site and keywords you mentioned in your question, but our hunch is that your competitor's rankings are not primarily due to their MySpace links. If you dig a little deeper you'll likely find other factors, such as links from other sources besides MySpace, site age, or possibly even something a bit more blackhat. Finally, we suggest using our Website Quality Indicator Tool to perform a competitive analysis on your competitor's site. Chances are this will shed some additional light on why the site is ranking well.