Google is beginning to shift its search ranking methods away from traditional means, such as scoring a site by the quality and quantity of external links pointing at it. Instead, it is moving towards a system that could end up basing search rank on usage information, either alongside prior methods or as a complete replacement. This will require optimisation services to look at new ways of building sites so that they can rank highly for relevant search terms. As is often the case, this news only emerged after Google filed a patent, the details of which offer a behind-the-scenes look at how its search ranking may operate in the future.
The real issue Google highlights in its patent is that ranking pages based on a set number of search terms and their appearance within a site allows site owners to use manipulative techniques to rise to the top of the rankings, even when they do not actually merit that position. Google also highlights a problem with link scoring: older, more established sites easily outweigh newer sites with fewer inbound links, even when the newer sites may be more relevant.
Various ways of checking the quality of web pages:
If Google moves towards putting an emphasis on usage data, it will take other factors into consideration, such as how regularly a site is visited and how many unique users are directed to a particular page. While this is already taken into account for searches, the way it is applied to determine rank will become more significant.
A part of the patent application describes a process in which Google determines a set of circumstances that will define the type of results that a typical user will see. After a search query is made, Google will compile its results based on traditional scoring as influenced by optimisation services. This list of sites will then be further organised based on the usage of the pages, with different weighting applied in a way that will alter the arrangement of the original results.
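The two-stage process described above can be pictured as a simple re-weighting step. The sketch below is purely illustrative, not taken from the patent: the field names, scores and blend weight are all assumptions, used only to show how a usage score might reshuffle results that were first ordered by traditional scoring.

```python
# Hypothetical sketch of the two-stage ranking described above: pages are
# first scored traditionally (link-based), then re-ordered by a blend of
# that score and a usage score. All weights and field names are assumptions.

def rerank(results, usage_weight=0.4):
    """Blend a traditional link-based score with a usage score and sort."""
    def blended(page):
        return ((1 - usage_weight) * page["link_score"]
                + usage_weight * page["usage_score"])
    return sorted(results, key=blended, reverse=True)

results = [
    {"url": "a.example", "link_score": 0.9, "usage_score": 0.2},
    {"url": "b.example", "link_score": 0.6, "usage_score": 0.9},
]
# b.example overtakes a.example once usage is weighted in
print([p["url"] for p in rerank(results)])  # → ['b.example', 'a.example']
```

Changing `usage_weight` shifts how far the usage data is allowed to alter the original arrangement, which mirrors the "different weighting" the patent alludes to.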
When it comes to providing a score based on the frequency with which a page is visited, Google will have a number of different options at its disposal. It could simply rank a site based on total all-time page views, or just those that have occurred within the past week. It might also work on a week-on-week basis, comparing growth in the number of views in order to find the sites that might be most relevant. Geographical weighting might even be included, so that users from the country of the site's origin are considered more important than those from outside that region.
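To make those options concrete, here is a rough sketch (an assumption for illustration, not anything Google has published) of how the three frequency signals mentioned above could be computed from a simple list of daily view counts:

```python
# Illustrative computation of the frequency options discussed above:
# all-time views, views in the past week, and week-on-week growth.

def frequency_signals(daily_views):
    """daily_views: list of per-day view counts, oldest first."""
    all_time = sum(daily_views)
    last_week = sum(daily_views[-7:])
    prev_week = sum(daily_views[-14:-7])
    growth = (last_week - prev_week) / prev_week if prev_week else 0.0
    return {"all_time": all_time, "last_week": last_week, "growth": growth}

views = [10] * 7 + [15] * 7   # prior week: 70 views, latest week: 105
print(frequency_signals(views))
# growth = (105 - 70) / 70 = 0.5, i.e. 50% week-on-week growth
```

A growth-based signal like this would favour a fast-rising new site over an established one with flat traffic, which is exactly the imbalance in link scoring that the patent complains about.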
The uniqueness of the users will play a similarly important role in determining a rank based on site usage. Google could look at IP addresses, cookie information and a range of other data to determine how many unique users a site has received in the recent past. It will also need to eliminate hits generated by botnets, or even by visitors who own the site.
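A minimal sketch of that de-duplication idea, assuming (purely for illustration) that a visitor is identified by an IP-plus-cookie pair and that bot and owner traffic can be recognised by a known IP list and a known cookie value:

```python
# Hedged sketch: count unique visitors from raw hits, using (ip, cookie)
# as a composite identity and dropping known bot and site-owner traffic.
# The identifiers and filtering criteria are assumptions for illustration.

KNOWN_BOT_IPS = {"203.0.113.7"}

def unique_users(hits, owner_cookie="owner-session"):
    seen = set()
    for hit in hits:
        if hit["ip"] in KNOWN_BOT_IPS or hit["cookie"] == owner_cookie:
            continue  # exclude botnet traffic and the site owner's own visits
        seen.add((hit["ip"], hit["cookie"]))
    return len(seen)

hits = [
    {"ip": "198.51.100.1", "cookie": "abc"},
    {"ip": "198.51.100.1", "cookie": "abc"},            # repeat visit, same user
    {"ip": "203.0.113.7", "cookie": "bot"},             # known bot, excluded
    {"ip": "198.51.100.2", "cookie": "owner-session"},  # site owner, excluded
]
print(unique_users(hits))  # → 1
```

In practice the filtering would be far more sophisticated, but the principle is the same: raw hit counts are easy to inflate, so the signal has to be reduced to genuine unique visitors before it can influence rank.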
It seems likely that Google will take a hybrid approach to search ranking if usage data is to be considered, rather than throwing out the baby with the bathwater and relying solely on this new method.
The above article is composed and edited by Donna B. She works with many SEO companies as a freelance writer and adviser. In her free time she writes articles related to SEO, optimisation services, social media and more.