This is the second part of the essential SEO tips for WordPress blogs covering the topics of Google Sitemaps plugins, pings and ping servers, valid (X)HTML, importance of a layout that puts post content ahead of sidebars and navigation, and displaying post excerpts and teaser text on the home page.
You should also check out other articles relevant to SEO for blogs: How to Make a WordPress Blog Duplicate Content Safe and SEO for WordPress Part 1.
WordPress is without question the most popular stand-alone blog platform. It is flexible and customizable, and there are plenty of useful plugins providing any functionality a blogger can think of. However, a fresh WordPress installation leaves a lot of room for improvement, for instance in search engine optimization and duplicate content proofing.
Below is a rundown of useful tips that can help improve your blog's position in search engines as well as provide some additional benefits to your readers.
How much blog spam is produced in 5 minutes on a quiet Sunday evening? What is the ratio of spam blogs on the most popular blog services? To answer these questions I present the results of an experiment that analyzed ping data and manually reviewed blogs.
The relative ease of creating and maintaining blogs makes them ideal tools for spamming search engines. Spam blogs, or splogs, serve two basic purposes: making money from advertising and affiliate programs, and participating in link farms. But making money from AdSense and providing nepotistic links are not by themselves what makes a blog a splog. Otherwise we would have to classify all blogs showing ads or promoting a business as spam, and there are thousands of popular, quality blogs that would fall into this category. The distinctive feature of a splog is that it has no use for its visitors. Should Google ban a splog from AdSense and prevent its links from passing on authority, such a splog would have no remaining value or purpose. So my definition of a splog would be "a blog whose only purpose is showing contextual or affiliate ads, or boosting the link popularity of certain target sites".
Are article submissions worth the time? When doing manual submissions it takes me 10 to 15 minutes on average to log in, format and submit an article to an article directory. I have to make at least 10 submissions to get feasible exposure for my articles, so this process can take more than two hours a day! To make the most of my time I have to make sure that the article directories I submit to are able to bring me as many visitors and backlinks as possible. I can name the top five directories: EzineArticles.com, Buzzle.com, GoArticles.com, ArticlesFactory.com and WebProNews.com. Submitting to these is a must! EzineArticles.com and Buzzle.com can bring you a lot of traffic, GoArticles.com brings you backlinks, and ArticlesFactory.com brings PageRank (my profile page there is now PR5, with only 12 submitted articles).
Will the system described in Google's recent patent become a new ranking algorithm to augment the existing PageRank?
From the very beginning, Google's distinctive feature was hyperlink-induced popularity ranking. Algorithms using text content to evaluate the relevancy of web documents played a much smaller role. The reasons for this disparity are purely pragmatic: authors of web documents have total control over their content and are at liberty to modify it to deceive ranking algorithms and get higher positions in search results. Hyperlinks, however, are much less influenced by webmasters and provide a more reliable measure of authority (link weight) and relevance (link anchor).
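Hyperlink-induced popularity ranking can be illustrated with a minimal PageRank-style power iteration. The toy link graph and the 0.85 damping factor below are illustrative assumptions for the sketch, not Google's actual data or parameters.

```python
# Minimal PageRank-style power iteration over a toy link graph.
# Each page starts with equal rank; on every pass a page distributes
# a damped share of its rank to the pages it links to.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # dangling page: spread its rank evenly over all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page site: pages with more incoming links
# accumulate more rank, regardless of their own text content.
graph = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(graph)
```

In this sketch "home" ends up with the highest rank because both other pages link to it, which is exactly the property that makes links harder to game than on-page text.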
Now Google introduces a new way to evaluate the relevancy of a web document based on its content, one which might prove immune to manipulation attempts. Interested? Read on!
Reciprocal link exchange is still an important link popularity building strategy despite all the measures taken by the search engines to diminish its effect. Back in 1999-2001 obtaining a quality link exchange was not difficult, and webmasters used to respond more willingly to e-mail requests. But as more people became aware of this strategy, the reciprocal linking scam became common practice.
Sometimes I check the old 'link exchange' e-mail account I used to build link popularity for my very first website. Lots of people contact me daily with exchange proposals. Well, not actually people: they are mostly bots.
Probably one of the reasons I still maintain that e-mail account is that those requests are a source of persistent amusement for me. One example: a request in pink letters, with images of dancing puppies and bouncing hearts, written by a 'blond chick' (picture attached) asking me to link to her pharmacy site! Or maybe I just enjoy reading the admiring comments on the look and content of my site that precede every exchange proposal?
The link exchange scam is an interesting subject of study in its own right and still awaits its researchers. But in the meantime the SEO community has been successfully summarizing the guidelines for the perfect link exchange scam.
Recently I was approached by a colleague from Portugal who offered me the following article on SEO and the online advertising market in his country. I am glad to publish this report by Nuno Hipólito here. Interested? Read on!
Some blog for fun, some blog for money, some blog for both. There are numerous options to monetize a blog: AdSense ads, affiliate links, paid reviews, links to your products, you name it. If your blog receives enough visitors you can start making a living online. To make the most of your visitors you must keep in mind where they come from. Those who arrive at your blog from search engine results, or are directed to you by links from other websites, can see your pages in full. But your revenue-generating ads and links are hidden from those who read your RSS feeds. This means that your online money machine loses clicks from a substantial portion of your most loyal visitors. Is there a way to make money in RSS feeds? Yes, try 'feedvertising'.
Average Keyword Saturation for Google, MSN and Yahoo
When deciding upon keyword placement we all try to get the most out of our target keyword saturation. At the same time no one wants to get penalized for accidentally inserting too many keywords into the page copy, or for including too many words between H1 tags. Since search engines will never publish the exact numbers for the maximum allowed keyword frequency or keyword prominence, all we can do is study the top pages in SERPs and make more or less informed guesses. Or we can conduct an experiment and calculate the average numbers for the top pages in the results of the major search engines: Google, Yahoo! and MSN. For the tables below I used data provided by WebPosition software, which calculates the average scores of the top 5 positions for dozens of keyword searches conducted by WebTrends Inc. Interested? Read on!
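To make the idea of keyword saturation concrete, here is a rough sketch of how keyword density can be computed for a page copy. The tokenization rule and the sample copy are my own assumptions for illustration, not WebPosition's actual scoring formula.

```python
# Sketch: keyword density as the percentage of words in the copy that
# belong to occurrences of a (possibly multi-word) target keyword.
import re

def keyword_density(text, keyword):
    """Return the percent of words in `text` covered by `keyword` matches."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw_words = keyword.lower().split()
    if not words or not kw_words:
        return 0.0
    hits = 0
    # slide a window of the keyword's length over the word list
    for i in range(len(words) - len(kw_words) + 1):
        if words[i:i + len(kw_words)] == kw_words:
            hits += 1
    return 100.0 * hits * len(kw_words) / len(words)

# Hypothetical 9-word page copy with two "seo tips" occurrences:
copy = "seo tips for wordpress blogs and more seo tips"
density = keyword_density(copy, "seo tips")
```

Here 4 of the 9 words belong to keyword matches, so the density is about 44%, far above anything you would want in real page copy; the point of the averages in the tables is to see what densities actually rank.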
In one of my recent posts I wrote about the duplicate content issue. This topic is especially important to me since my blog uses the WordPress content management system, which, in its default configuration, is not duplicate content proof. In fact this CMS can render almost 100% of your content duplicate. As usual, the fault of the system has its roots in its advantages. WordPress has many features facilitating blogging and linking, such as RSS feeds for posts and comments, trackback URLs, monthly archives and so on. At the same time this variety of URLs returning similar or identical pages represents a clear case of duplicate content. Interested? Read on!