Notes, Tips, Questions, & Answers...
aka, topics too short for an article, but too important to leave out!
Topics covered:
- Why adding more content pages to your site won't necessarily help you rank better.
- Tips on how to determine if a site selling links is still passing PageRank.
- How to improve your search engine rankings with just a few lines of code.
- What's better? One big site or lots of little sites?
- Does Google cross-check domain name ownership when indexing a site?
- How to optimize for both the singular and plural versions of your keywords.
Will adding more content pages to my site help me rank better?
Answer: Theoretically, yes. Every page is supposed to have at least a minimal amount of link juice it can pass on by linking to other pages. In practice, however, it would take massive numbers of pages to have much impact, because the PageRank of each page would be truly minuscule. You'd get much better results by simply acquiring links from other sites and pointing them at your most important pages. That way you get an influx of PageRank from an outside source that you can then distribute throughout your site. This is both more natural (from a search engine's point of view) and more effective than generating huge numbers of low-quality pages on your site in an attempt to build up a critical mass of PageRank.

Let's also not forget about sites like CSS Zen Garden, which is essentially a one-page site with a PageRank of 9 that ranks on the first page of Google for the highly competitive phrase CSS. Clearly, ranking is about compelling content and great links, not lots of pages.

One benefit of a large site is that having more pages means you can potentially target more keywords. This is particularly helpful in going after long-tail 3- and 4-word search phrases. In essence, you're casting a wider net by having more keyword-rich pages. For example, a blog that posts regularly is creating a huge number of keyword-rich pages, each of which could potentially rank and snag traffic for very specific queries. However, this approach only works if many of those posts have their own inbound links. A site with a huge number of pages but no inbound links to those pages is most likely destined to have much of its content banished to the supplemental results, where it will languish in obscurity.

If you can create a large number of keyword-rich pages and get quality links to many of those pages, then having a large site is going to help you do very well in the search engines.
But cranking out low-quality pages simply to pad your page count is unlikely to have much effect. Note: Cranking out lots of low-value pages is a tactic often used when creating a large mini-net of low-quality sites that can be pointed back to a main site. If you're into blackhat SEO, creating several sites with large numbers of bogus pages in an attempt to boost PageRank might be something to look into. But if your goal is to rank highly in Google for the long term and you want to avoid getting banned, we'd recommend steering well clear of this approach.
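To see why each thin page holds so little value, here's a toy sketch of the standard iterative PageRank computation (damping factor 0.85). The site structure is hypothetical: a homepage plus 100 thin pages that link only back and forth internally. Each thin page ends up holding well under one percent of the site's total PageRank, so each one has only a sliver to pass on.

```python
def pagerank(links, d=0.85, iters=50):
    """Toy iterative PageRank over a graph given as {page: [outlinks]}."""
    pages = list(links)
    n = len(pages)
    pr = dict.fromkeys(pages, 1.0 / n)
    for _ in range(iters):
        # Every page starts each round with the "random surfer" baseline
        new = dict.fromkeys(pages, (1 - d) / n)
        for p, outs in links.items():
            targets = outs if outs else pages  # dangling pages spread rank evenly
            share = d * pr[p] / len(targets)
            for q in targets:
                new[q] += share
        pr = new
    return pr

# Hypothetical site: a homepage linking to 100 thin pages,
# each of which links only back to the homepage.
site = {"home": [f"p{i}" for i in range(100)]}
site.update({f"p{i}": ["home"] for i in range(100)})

pr = pagerank(site)
print(round(pr["home"], 4))  # the homepage hoards most of the rank
print(round(pr["p0"], 4))    # each thin page holds only a tiny sliver
```

Notice that the total PageRank inside the site is fixed; internal pages only shuffle it around. Only inbound links from outside pages add new rank to distribute.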
Tips on how to determine if a site selling links is still passing PageRank.
Answer: It's difficult to establish this definitively, since a site may have many links and there's no clear way to determine exactly which ones are passing PageRank without isolating and testing those links in a vacuum. However, you can usually get a pretty good idea if you see a high-PageRank page selling links, click on those links, and find that many of the sites buying them still have very low PageRank.

A good example is StatCounter.com. This is a PageRank 10 site that sells text-link advertising in its left-hand menubar. A PageRank of 10 is extremely high, especially when you consider that even Yahoo only has a PageRank of 9. StatCounter gets this high PageRank because they embed a link to themselves in the free web stats package they provide to millions of sites. It's a perfect example of building links by providing tools to your users.

We've been watching those links, and some are still at PageRank 4 or less, despite being linked from that page for several months. We can't say for sure, but that would indicate to us that this page is not passing PageRank. Since there's only a small number of outbound links on the StatCounter homepage, one would expect the pages it links to to get a greater PageRank boost. Again, this is only a guess on our part based on observation; more extensive testing would need to be done to determine whether this site is actually passing PageRank. And because this is a highly trafficked page, some of the advertisers may be buying those links for the traffic they send rather than the PageRank boost.

Also, be aware that a paid link can lose its value after you've paid for it. When W3C.org was caught selling $1000 links from its PageRank 9 supporters page, it quickly slapped a meta nofollow tag on that page in order to stay in Google's good graces. And all those sites that had just paid a grand for that link?
SOL (Simply Out of Luck :^)) The site selling links typically has no problem continuing to rank well, but it can no longer pass PageRank. However, this can still have negative implications for the site if it links to some of its own domains, since it will no longer be able to help its own sites rank.

Keep in mind that the backlink reports provided by both Yahoo Site Explorer and Google Webmaster Central include links that aren't actually passing any ranking value to the page being linked to. For example, both tools include nofollowed links in their reports, as well as links from known link sellers who've lost their ability to pass PageRank. Just because a link shows up in these tools does not mean the search engines are counting it towards rankings.

We actually rarely buy links ourselves, as our preference is always to obtain links through natural methods (mostly viral marketing and networking with other businesses). However, in some cases buying links and advertising can be useful, so you should know how to recognize a good deal. You also need to know when a site you're competing with is buying links, so you know where their ranking power is coming from and what you need to do to compete. If you do decide to buy links from a site, be sure to check how the other sites buying links there are ranking. This will help you determine whether it's worth your money.
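For reference, the page-level "meta nofollow tag" mentioned above looks like this in a page's head section (this is the standard robots meta tag, shown here on a hypothetical page):

```html
<head>
  <!-- Tells compliant search engines not to pass ranking
       value through ANY of the links on this page -->
  <meta name="robots" content="nofollow">
</head>
```

This is distinct from the per-link form, rel="nofollow" on an individual anchor tag, which neutralizes only that one link rather than every link on the page.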
How to improve your search engine rankings with just a few lines of code.
Answer: The URL variations you need to be most concerned with in regards to ranking are:

http://domain.com/
http://www.domain.com/

If you've got links pointing to both the www and non-www versions of your site, your incoming links are unnecessarily being split between what Google often sees as two different sites (they're getting better at resolving this, but they still frequently get it wrong). The following issue is not quite as serious:

http://www.domain.com/
http://www.domain.com/index.html

However, we've still seen this create duplicate content problems occasionally, so it's a good idea to fix it as well. A 301 redirect can solve both of these problems quite easily. We cover the 301 redirect extensively in our report How to Use a 301 Redirect to Keep Your Web Pages from Dropping Off the Face of the Earth, but the short version is: add a small block of code to your .htaccess file that redirects http://domain.com/ to http://www.domain.com/ and redirects http://www.domain.com/index.html to http://www.domain.com/, thus solving most of the major duplicate content and link dilution issues with Google. Obviously, replace domain.com with your site's own domain name.

Search engines generally understand that http://www.domain.com and http://www.domain.com/ (with and without the trailing slash) are the same URL these days, so addressing that one is not as important. However, adding the trailing slash when linking to your homepage results in one less hit on your server (since the server won't have to append the slash itself) and a slightly faster response time for visitors, so it can still be useful.

We highly recommend being consistent in how you link to your pages, both in the internal links on your own site and in the inbound links you get from other sites. This results in less confusion about your pages on the part of the search engines.
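Here is one common way to set up both redirects, assuming an Apache server with mod_rewrite enabled (adapt domain.com to your own domain; exact rules can vary by hosting setup):

```apache
RewriteEngine On

# Redirect the non-www version of the site to the www version
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

# Redirect direct requests for /index.html to the root URL
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html
RewriteRule ^index\.html$ http://www.domain.com/ [R=301,L]
```

The R=301 flag makes each redirect permanent, which is what tells the search engines to consolidate the link value onto the target URL; the L flag stops rule processing once a rule has matched.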
By implementing the .htaccess code listed above, people will always see your URLs formatted the same way, so you'll minimize the chances of them linking to your pages in the wrong format. Note: You can also specify that Google crawl either the www or non-www version of your site in Google's Webmaster Central control panel, and it's a good idea to do so. However, we recommend that you also utilize the .htaccess code above to guarantee they get it right and also ensure that other sites link to you using a standard format.
What's better? One big site or lots of little sites?
Answer: Generally, our preference is to work on developing a single authoritative domain rather than spreading incoming links around multiple sites. However, it's really a matter of both personal preference and accommodating whatever opportunities your business is facing. You can do quite well with several sites if each site has completely unique content and stands on its own as an independent resource. It will take longer to get those sites off the ground, because you'll have to do link building on each, but once they're established it can be quite valuable to own several sites. A good example would be Shiny Media. They have a network of blogs all linked together, such as:
...and many more. Each site stands on its own. They're not about the same topics, they don't have the same content, and they generally link to each other only once, rather than many times. Also, since they're part of a blog network, it makes sense to link them together. And, of course, they have many links coming from outside sources rather than relying primarily on sites within their own network for link popularity.

On the other hand, if the sites you're linking together are mostly about the same topic and have very similar content, that can cause problems (from a search engine's point of view) if there's no reason for the sites to exist except to artificially inflate link popularity.

If you're planning to launch a new venture that's closely related to your existing site, consider using a subdomain on your existing site instead of creating a new site. In terms of getting ranked quickly, you can piggyback off the trust and authority of an established site, and there won't be any "sandboxing" or rankings delay for the content you add to the subdomain.

If you'd still prefer to launch a new site, you might start the content on a subdomain of your existing site so you can get search engine traffic right away while your new domain ages. In the meantime, add content to your new site and submit it to a few trusted directories to start the aging process. Make sure the content you add to your new site is different from the content you have on the subdomain of your main site; you want to avoid creating a duplicate content problem. Once the new site has built up some trust and ranking ability, you can 301 redirect your content from the subdomain on your older, trusted site over to the new site. That way there's no downtime on that content in terms of being able to rank in the search engines.
Content added to subdomains of your trusted domain can be expected to rank immediately, whereas if you added that content to a new domain you'd likely have to wait a year or so and do lots of link building before you'll see that content achieve high rankings. So it makes sense to launch a new venture on a subdomain of an existing site, then move it over to your new site once that new site has established some age and trust.
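When you're ready to make that handoff, the subdomain-to-new-domain move can also be done with a 301 redirect. A minimal sketch, assuming Apache with mod_rewrite and the hypothetical names blog.olddomain.com and www.newdomain.com:

```apache
# In the .htaccess served for blog.olddomain.com:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.olddomain\.com$ [NC]
# Send every path to the same path on the new domain, permanently
RewriteRule ^(.*)$ http://www.newdomain.com/$1 [R=301,L]
```

Redirecting page-to-page (same path on both hosts) rather than dumping everything on the new homepage preserves the ranking value each individual page has built up.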
Does Google cross-check domain name ownership when indexing a site?
Answer: Only the engineers at Google know for sure whether Google does this, but we do know that it's something that would be relatively easy for them to automate. To shed light on the ease with which such a cross-check could be accomplished, take a look at this tool: http://www.widexl.com/scripts/domain-spider/. You'll see that their "Whois Domain Spider script is design(ed) to scan hundreds of thousands (of) domains for their status and time expiring and save the information in a MySQL database." Now, if a small company can do it, it's safe to assume a multi-billion-dollar company with hundreds of Ph.D. engineers like Google can do it with ease, plus a lot more.

As a guideline, we've always applied the rule: if something is knowable, assume it is known. Because Google can know and cross-check domain ownership, we must operate as though they do know and cross-check. Even if they aren't currently cross-checking, they someday will, because it's so easy and it makes so much sense for them to do so.

If you're going to be running multiple sites and linking them together in a way you think Google won't approve of (i.e., the classic mini-net strategy), then it makes sense to disguise that strategy as much as possible. That means different web hosts, different whois data, not using the same AdSense or affiliate code across your entire network of sites, different artwork and templates on each site, and so on. You don't want Google rooting out your network by either automated or manual review methods. Remember also that Google buys a lot of businesses, and at any point they could buy the web host, domain name registrar, or even affiliate network you're using. Keeping your sites spread around makes it less likely that Google will be able to wipe out your entire network in one fell swoop.
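To illustrate just how simple this kind of cross-check is, here's a toy Python sketch that pulls ownership-related fields out of raw whois output. The sample text and field names are made up for illustration; real whois record formats vary considerably by registrar.

```python
import re

def extract_registrant(whois_text):
    """Pull registrant-related lines out of raw whois output.
    The field-name patterns here are illustrative only."""
    fields = {}
    for line in whois_text.splitlines():
        m = re.match(r"\s*(Registrant [\w ]+|Registrar|Name Server):\s*(.+)", line)
        if m:
            fields.setdefault(m.group(1).strip(), m.group(2).strip())
    return fields

# Hypothetical whois output for a hypothetical domain:
sample = """\
Registrar: Example Registrar, Inc.
Registrant Name: Jane Doe
Registrant Organization: Example LLC
Name Server: NS1.EXAMPLEHOST.COM
"""
print(extract_registrant(sample))
```

Run that over the whois records for two domains and compare the resulting dictionaries, and you have a crude ownership cross-check; matching registrant names, organizations, or name servers across a batch of domains is exactly the kind of signal a search engine could harvest at scale.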
How to optimize for both the singular and plural versions of your keywords.
Answer: If you optimize using the plural version of a term like car, your site will be indexed for both car and cars, but that does not mean you'll rank equally for both terms. A search engine's results will often include pages that have variations on a word, but will give higher priority to pages that contain exactly that word. If your keywords are commonly searched for in both singular and plural versions, it's a good idea to work both versions into your page, especially in your title tag.

When optimizing your title tags, we recommend repeating the page's primary keyword one to two times (but no more than that). The repetitions should generally be some slight variation of that keyword, such as a plural. For example, if you're targeting Las Vegas vacations, a good title tag would be: Las Vegas Vacations - Las Vegas Vacation Packages - Vacation in Las Vegas NV

Also important is the anchor text in the links pointing toward your site. Your inbound links are hugely influential in determining what you'll rank best for. While you're limited in the number of keywords you can optimize your title tags for, you can target a much larger range of tenses (plurals, singulars), modifier keywords (words like buy or find, as in find las vegas vacations), and other related keywords via the anchor text of the links you have pointed at your pages.
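As markup, the example title above would sit in the head section of the page like this (hypothetical page, of course):

```html
<head>
  <!-- Primary keyword plus plural/singular variations, repeated no more than twice -->
  <title>Las Vegas Vacations - Las Vegas Vacation Packages - Vacation in Las Vegas NV</title>
</head>
```

Note that the singular (Vacation) and plural (Vacations) both appear, covering both versions of the search phrase without keyword stuffing.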