Anyone who knows the term SEO probably knows the importance of good, keyword-rich content and links. These two factors play a large role as Google ranking factors. However, the Google algorithm is fairly complex, and there are many lesser-known ranking factors. Here are 7 factors that you really should incorporate into your SEO knowledge base.
1) Site speed: Google wants the results that appear within its listings to be not only relevant but also from sites that load fast and are user friendly. Site speed is how long it takes a standard browser to load your page. Slow loading pages will actually rank slightly lower than comparable fast loading pages. Google's Webmaster Tools used to report this under "site performance" > "site speed," but Google has since removed that option, so I recommend another tool called YSlow. YSlow is a browser add-on that reports problems that could cause a page to render slowly. Hopefully, Google will reintroduce the site speed option in Webmaster Tools.
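If you just want a rough number before reaching for a full tool, you can time a single page fetch yourself. This is only a sketch: it measures raw download time, not the full render time a real browser (or Google) would see, and the URL is a placeholder for your own page.

```python
import time
import urllib.request

def fetch_time(url: str) -> float:
    """Return the seconds taken to download a page once.

    A rough proxy for site speed: real browsers also fetch
    CSS, JavaScript, and images, and then render the page,
    so treat this as a lower bound, not a full measurement.
    """
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()  # pull the whole response body
    return time.perf_counter() - start
```

Run it a few times and average the results, since a single fetch can be skewed by network noise.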
2) W3C compliance: Like site speed, which is often a coding issue, the overall quality of a site's code can play a role in its rankings. The standard for what is considered correct or incorrect code is set by W3C.org.
Bad code is not only undesirable for potential ranking reasons; it can actually prevent proper Google indexing. Coding issues can result in duplicate content problems or in pages not getting indexed at all. To check whether your site is W3C compliant, run it through http://validator.w3.org/
3) Click-through rates: Click-through rate is the number of clicks your page receives from Google compared to the number of times your page was shown within the search results (impressions). In Google AdWords, click-through rate is extremely important in determining how ads are ranked. There has been a lot of talk about whether click-through rate is also used in determining the organic rankings. I believe that Google does give some weight to click-through rate because it is a measure of quality. The easiest way to improve your click-through rates is to write strong meta descriptions. The meta description tag is used by Google as the description that appears within the search results. Having a unique meta description for each page, written for the user, can vastly improve click-through rates. You can find your site's click-through rates (CTR) in Google's Webmaster Tools under "Traffic" > "Search Queries," where CTR is shown as a percentage.
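The CTR percentage Webmaster Tools shows you is a simple ratio. Here's a minimal sketch of the calculation, so you can compute it yourself from any clicks/impressions numbers you export:

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR as a percentage: clicks divided by impressions.

    Returns 0.0 when there are no impressions, to avoid
    dividing by zero for pages that were never shown.
    """
    if impressions == 0:
        return 0.0
    return 100.0 * clicks / impressions

# Example: 40 clicks out of 1,000 impressions -> 4.0 (%)
print(click_through_rate(40, 1000))
```

If a page has lots of impressions but a low CTR, that's usually the page whose meta description is worth rewriting first.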
4) Site popularity: Though it is a popular misconception that the most popular sites are the ones that are top ranked, Google does store these metrics and probably gives them some weight. The question is how much weight. There are companies that will artificially drive traffic to your site in hopes of improving rankings, but at this time I would recommend against using those services. I don't believe this factor is as important as people may think. Note that Alexa rank is similarly viewed as a way to improve rankings. Alexa rankings are fairly easy to manipulate and are based on only a small sampling of web users.
5) Bounce rate: Bounce rate is the percentage of visitors who leave a site after viewing only a single page. A typical website will have a bounce rate of around 50%. The goal is to get the bounce rate as low as possible. To lower your bounce rate, make your site obvious, so that when someone visits they know what you are about and why they should stay longer. Whether Google uses this stat in determining rankings is debatable, but Google does keep traffic data, and since it is focusing on quality, this is an obvious way to track how users feel about the quality of a site.
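To make the definition concrete, here's a small sketch that computes bounce rate from a list of sessions, where each entry is the number of pages that session viewed (the data format is my own illustration, not a Google export):

```python
def bounce_rate(pages_per_session: list[int]) -> float:
    """Percentage of sessions that viewed exactly one page (a 'bounce')."""
    if not pages_per_session:
        return 0.0
    bounces = sum(1 for pages in pages_per_session if pages == 1)
    return 100.0 * bounces / len(pages_per_session)

# Example: four sessions, two of which bounced -> 50.0 (%)
print(bounce_rate([1, 3, 1, 2]))
```

Analytics tools do this aggregation for you, but seeing the arithmetic makes it clear why a sticky, obvious landing page pushes the number down.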
6) Social signals: As social media becomes increasingly important, Google has focused more and more on the information it provides. Social signals are based on a site's social impact. Social activity around a site does play a role in how it will be ranked. So tweets, retweets, likes, Google +1s, status updates, and other social shares are all seen as positives for a site's rankings. I expect this ranking factor to play a greater and greater role in the near future.
7) Human manual review: Google has long stated that it tries to keep the process of determining rankings automated and computer driven. However, for many popular search terms there is some human review. The reviewers use the "Google Quality Raters Handbook" to determine whether top ranked sites should really be top ranked. Currently, Google has about 4,500 reviewers who help improve the quality of the sites that are well ranked. Making machines happy with your site is one thing; making a human reviewer happy is much more challenging. However, here is some advice: give your site quality, unique content and make it user friendly. If someone would actually want to spend time reading what you have to say, then it is probably good. NOTE: The document these reviewers use was leaked, but Google has forced site owners to remove it, so it is almost impossible to download a copy of the Google Quality Raters Handbook PDF. However, I just happen to have a copy of the latest version; feel free to email me for a copy.
Feel free to comment below.