The head of Google’s anti-spam team, Matt Cutts, reviewed websites at the 2006 PubCon conference in Las Vegas. Some of the statements made in these public reviews might help SEO companies improve the rankings of their clients’ websites in Google, Yahoo and Bing search results.
Duplicate content is creating problems for many sites
One of the websites that Matt Cutts analyzed had a problem with duplicate content. The owner had more than 20 different sites but re-used the same content over and over again across all of them.
In addition, the same title, meta description and keyword tags were used in the head section of the pages across the different websites. This invites major problems and certainly does not boost rankings in search engines.
By comparing the IP addresses of the sites you own, search engines can work out which other websites belong to the same owner.
Matt said that varying the duplicate pages by adding a few extra sentences here and there, or by changing a few words, wouldn’t work either.
This means that your content needs to be original and unique for each website, covering different themes and interests. Simply copying and pasting content will not help a website. The same applies to content copied from other websites, which is really not a clever idea.
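As a first check against the problem described above, you can compare the title and meta description tags across your own pages. Here is a minimal sketch in Python; the page data is purely illustrative, and in practice you would fill it from a crawl of your own sites:

```python
# Sketch: flag pages that share an identical title/meta-description pair.
# The URLs and tag text below are hypothetical examples.
from collections import defaultdict

pages = {
    "site-a.example/index.html": ("Widgets Online", "Buy widgets cheap"),
    "site-b.example/index.html": ("Widgets Online", "Buy widgets cheap"),
    "site-c.example/index.html": ("Gadget Reviews", "Independent gadget tests"),
}

def find_duplicates(pages):
    """Group URLs by their (title, description) pair; return groups of size > 1."""
    groups = defaultdict(list)
    for url, meta in pages.items():
        groups[meta].append(url)
    return {meta: urls for meta, urls in groups.items() if len(urls) > 1}

for meta, urls in find_duplicates(pages).items():
    print("Duplicate title/description:", meta[0], "->", urls)
```

Any group this prints is a set of pages that needs its own unique title and description.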
Very big sitemaps will cause problems for search engines
Another website had top rankings in Google but couldn’t get any meaningful ranking in Yahoo search results. The site had a very large sitemap (more than 100 links) that listed hundreds of articles on one page. This could trigger the filters of some search engines, and there is a danger that such pages could be identified as cloaking pages. Matt Cutts suggested splitting the sitemap into smaller pages with no more than 100 links per page; this can actually prevent being blacklisted.
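The splitting itself is trivial to automate. The following Python sketch chunks a long list of article links into sitemap pages of at most 100 links each; the URLs and file names are illustrative only:

```python
# Sketch: split one long sitemap into pages of at most 100 links each,
# following the advice above. URLs and page names are made-up examples.
def split_sitemap(links, max_links=100):
    """Yield successive chunks of at most max_links links."""
    for i in range(0, len(links), max_links):
        yield links[i:i + max_links]

all_links = [f"https://www.example.com/article-{n}.html" for n in range(1, 351)]
pages = list(split_sitemap(all_links))
# 350 links -> 4 sitemap pages: 100 + 100 + 100 + 50
for n, page in enumerate(pages, start=1):
    print(f"sitemap-{n}.html: {len(page)} links")
```

Each chunk would then be written out as its own sitemap page, with a short index page linking to all of them.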
Good SEO should focus on high quality back links
If inbound links are built too quickly, they don’t have a positive effect on your rankings in search engines.
You can easily create one-way links to improve link popularity in search engines:
Reciprocal link exchange should, however, be done with related sites that have something in common with your own site. For example, if your website is SEO-related, you should exchange links with SEO, web design and SEO software websites. Reciprocal link building with unrelated sites won’t help much.
Avoid session IDs and query strings in URLs
Matt Cutts says that it makes sense not to use URLs with session IDs (?=id-76). The reason is that long URLs with many variables can cause problems for search engine spiders and crawlers. This is also explained in Google’s help centre:
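If your platform appends session IDs to URLs, one practical workaround is to strip those parameters so crawlers see one clean, stable address per page. Here is a minimal sketch using Python’s standard urllib; the parameter names it treats as session IDs (sid, sessionid, PHPSESSID, jsessionid) are common examples I’ve assumed, not an official list:

```python
# Sketch: remove session-style query parameters from a URL so search
# engine crawlers see a single clean address per page.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Assumed set of session-ID parameter names; adjust for your own platform.
SESSION_PARAMS = {"sid", "sessionid", "phpsessid", "jsessionid"}

def clean_url(url):
    """Drop known session-ID parameters, keep everything else."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(clean_url("https://www.example.com/page?PHPSESSID=abc123&cat=seo"))
# -> https://www.example.com/page?cat=seo
```

In practice you would apply this kind of rewrite server-side, so the session ID never reaches the URL that crawlers fetch.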
Having too many domains and private WHOIS might hurt your rankings
Matt Cutts indicated that it might hurt your rankings if you have too many domains and if you use these domains just to create web sites and display only PPC ads:
“Having lots of sites isn’t automatically bad, and having PPC sites isn’t automatically bad, and having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just maintains a single site or so.”
If you try to cheat Google, Yahoo or Bing, it’s likely that one of their filters will apply to your web page and you could get banned from the search results.
Your site should be informative and search engine friendly. If you own a site, make sure there are no technical errors that prevent search engines from indexing your pages, and hire a professional search engine optimization company to improve your rankings only if you cannot manage it yourself.