200+ Google Ranking Factors to Optimize Your Website Better

We all know that Google uses more than 200 factors to rank a particular web page in its search index. Today, we are bringing all of these factors together in one place.

Some of the factors below have never been confirmed by Google; they are listed based on expert opinion and numerous case studies. Scroll down for the detailed list of Google ranking factors: it will help you optimize your website better.

Note: No one except Google itself can tell you the exact parameters used for ranking. This is simply a small attempt to explain how webpages are ranked.

I. Page Level

1. Keyword in Title Tag

The title of the web page is the most important thing that you should worry about. Your page title appears in Google search results, which plays a crucial role in attracting visitors.

2. Keyword in Description tag (just to improve CTR)

The meta description is not a direct ranking factor, but it helps Google understand your content's topic more specifically, and it also helps increase your CTR in SERPs.

3. Keyword in H1 tag

The H1 acts as a second title for the content; it helps search engines understand your content better.
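
As a quick illustration of factors 1-3, a page built around the phrase "google ranking factors" might mark up its title, description, and H1 like this (the wording and URL are only placeholders):

    <head>
      <title>Google Ranking Factors: The Complete List</title>
      <meta name="description" content="A detailed list of the Google ranking factors that influence how your pages appear in search results.">
    </head>
    <body>
      <h1>Google Ranking Factors</h1>
      <!-- main content -->
    </body>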

4. Content Length

Usually, long articles are more helpful than short ones. Search engines tend to treat pages with fewer than 300 words (the number is not fixed) as thin content. Take this post as an example: we could not explain Google's ranking factors within 300 words.

Keyword | Position | Word Count
How To Conduct A Job Interview | 1 | 1810
How To Write A Cover Letter | 1 | 1254
How To Do Keyword Research | 1 | 1004
How To Groom A Beard | 1 | 980
How To Hypnotize Someone | 1 | 2320
Most Expensive Software | 1 | 1075

5. Keyword as a first word in title tag

Title tags that start with a focus keyword generally perform better than titles that have a keyword in the middle or end position.

6. Keyword in the most frequently used phrase

Keywords distributed throughout the content indicate how closely the body matches the title, which is another ranking signal.

7. Keyword Density

Google still takes keyword density into account, and a density above roughly 2% might hurt your web page's ranking. For example, in a 1,000-word article, using the keyword 20 times already amounts to a 2% density.

8. LSI Keywords in Title and Description tags

Quality content naturally contains a few related words in the description tag. LSI (Latent Semantic Indexing) looks for synonyms of the focus keyword in the title and description.

9. Duplicate Content

Duplicate or slightly modified content present on the same website can drop your ranking in search engine positions.

10. Page Loading Speed (HTML)

Web crawlers (also known as spiders) can estimate (and measure) your site’s speed based on page code and content size. Slow page load time (usually more than 6 seconds) will definitely drop your web rank and will increase the bounce rate, which is bad for SEO.

11. Page Loading Speed (Chrome)

Google also uses Chrome data to get more accurate information about page load times. Server response time and CDN services are also taken into account.

12. Canonical URL

Use a canonical URL to prevent multiple URLs from serving the same content. If you have more than one URL for a single post, Google may treat them as duplicate content.
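
For example, if a post is reachable at several addresses, you can point them all at one preferred URL with a canonical tag in the page's <head> (the URL below is only a placeholder):

    <link rel="canonical" href="https://www.example.com/google-ranking-factors/">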

13. Recently Updated page

In the case of time-sensitive searches, Google Caffeine gives priority to recently updated content. Google displays the page’s last update date along with the search result.

14. Page Update History

The frequency of page updates also plays an important role in ranking. How often is the information in the post updated: daily, weekly, monthly, or yearly? This reflects the freshness of the content on your blog.

15. Percentage of content updated

One can’t simply fool Google by adding a single white space or character in order to alter the update frequency. Along with the frequency, Google also analyses how much data has been updated.

16. Image optimization

Images in quality content are the cherry on top. Don't forget to set each image's title, alt text, and description; they help Google understand what your image is about.

Please see also:  https://www.codingace.com/seo/optimizing-images
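
For reference, a well-described image tag might look like this (the file name and text are hypothetical):

    <img src="/images/google-ranking-factors-chart.png"
         alt="Chart summarizing the main Google ranking factors"
         title="Google ranking factors chart"
         width="800" height="450">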

17. Early Keyword

Keywords appearing in the first 100-150 words signal content relevance. No one will read your article if you go off topic, and Google applies the same logic when ranking web pages.

18. Keywords in H2 and H3 tags

Keywords appearing in Header 2 or 3 might enhance your chances of ranking high.

19. Keyword in URL

This is another relevant factor – your keywords should be present in the URL instead of any random post id number. Besides this, keywords appearing in top level domains don’t give that much boost, but still act as a relevancy signal.


20. Keyword Word Order

An exact match between the query and the content phrase generally results in a higher search position. That’s why Keyword Research plays a crucial role in generating traffic.

21. Keyword Density Across Domains

Krishna Bharat introduced the Hilltop algorithm, which improved search by identifying pages it labels "experts" on a topic. The algorithm first builds a list of the most relevant expert pages for the query topic, then identifies the relevant links within that set of experts and follows them to the target web pages. The targets are ranked according to the number and relevance of non-affiliated experts pointing to them.

22. Outbound Links

The URL(s) you link out to tell the search engine about your content's relevance. For example, if you write the word "Apple" and insert an outbound link to Apple.com, it signals that you are talking about the company, not the fruit.

23. Outbound link quality

Linking to trusted, higher-authority websites generally helps you. Never send your audience to malware or to pages in an unknown language.

24. Spelling and Grammar

Nowadays, search engines check for both spelling and grammar, including natural language used in the content.

25. Relevancy of Outbound links

The theme of the outbound link should be similar to your content. If you are writing about "fast cars" and linking to "cooking methods", this is obviously going to hurt your SERPs.

26. Useful supplementary content

Along with your content, if you add a few extra things that are somehow helpful for users, that is supplementary content. For example, if you are writing about personal home loans, you can add an interest calculator, the interest rates of various banks, etc.

27. Number of Outbound links

There are two types of links: nofollow and dofollow. Too many outbound dofollow links can dilute PageRank and, with it, your search engine visibility.

28. Content Uniqueness

You have heard it so many times: "Content is King". Well, that's definitely true. Google is always hungry for unique content.

29. Number of internal links

The number of internal links shows the importance as well as the dependency of web pages present on your website.

30. Quality of internal links

Internal links from authoritative and well-indexed pages always have a higher impact as compared to thin-content pages.

31. Multimedia

Adding multimedia such as video, infographics, images, or audio files can help your audience grab information efficiently and quickly, which is another quality signal.

32. Broken links

Having too many broken links on a single page is a sign of a dead, old, inaccurate and not useful page. This might degrade your performance in the search engine.

33. Affiliate links

Affiliate links alone can’t hurt your site at all. But too many affiliate links can attract Google to reanalyze your website. And if they find “no added value content” or “content made for affiliates only”, no doubt, they will penalize your webpage.

34. Novel Document Content

Google devalues much more than just similar content; it has patented several methods for flagging an article as uninteresting. One of them is the Novelty Score, which measures the amount of novel content contained in each document in an ordered sequence of documents. The novelty score is usually checked against both the page and the overall website content.

35. Page Host’s Domain Authority

If similar content is present on two different pages in different domains, Google will give priority to the one having the higher Domain Authority.

36. PageRank of Page

In the previous case, if the domain authority is the same, then the priority will be given to the page with the higher PageRank.

37. URL path depth

A post closer to the home page in the URL structure generally tends to carry more authority than pages buried far from the home URL.

38. URL length

Excessively long URLs are not good from Google's perspective. That's why many SEO tools include functions to trim URL length.

39. Reading level

This reflects how easy the content is to read. Overly long paragraphs, ugly spacing, poor fonts, and sloppy formatting make your article hard to read.

40. Page Category

Similar content should be clustered under a single category. If the category page is indexed well in the search engine, then the post under the same category gets a boost.

41. WordPress Tags

Tags relate one piece of content to another and group pages together with other internal pages, which affects crawl efficiency as well as your webpage's ranking.

42. References and Sources

Getting references from well-established sources or indicating a few of those sources in your content is a sign of quality content. For example, Wikipedia displays several external sources at the end of the article.

43. Pages Priority in Sitemap

Priority of pages is given through the Sitemap.xml file. You can set the priority from 0.0 to 1.0.
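
A sitemap.xml entry with a priority value looks roughly like this (the URL and values are placeholders):

    <url>
      <loc>https://www.example.com/google-ranking-factors/</loc>
      <lastmod>2015-06-01</lastmod>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
    </url>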

44. Too many outbound links

A few pages contain far more outbound links than necessary. From Google's perspective, they exist only to distract users from the content. The number is not fixed, but around 150 outbound links can be considered a threshold.

45. Bullet or Numbered List

Bullet and numbered lists are more user-friendly and much easier to read than plain paragraphs. That's why Google tends to prefer list-style content.

46. Layout Quality

If you use a well-structured layout that gives immediate visibility to the main content, your webpage will rank higher in search results. Don't make users scroll just to reach the beginning of the article they are looking for.

47. Close Keywords

The closeness of one word to the other implies association. For example, one paragraph about “second hand cars in New York” would rank higher (especially in New York) than two paragraphs about “second hand cars” and “New York”.

48. Age of Page

Google generally prefers fresh content. However, old content that is updated regularly can get even more priority.

49. Useful Content

There is a huge difference between quality content and useful content, and Google prefers useful content. This is best explained with an example. Suppose you search for "pinched nerve" and get two results: the first from ans.yahoo.com and the second from mayoclinic.com. The second is written in highly professional language, full of formal and complex terms that only doctors fully understand. It is clearly the better result in terms of quality, but it is not useful to the majority of the audience, so in this case Google will prefer ans.yahoo.com.

50. Number of other keywords page ranks for

If a page ranks very well for other keywords, it represents the quality and popularity of the page.

51. LSI in Content

Latent Semantic Indexing searches for synonyms present in the content. For example, if your title is “New Cars”, the search engine would expect to find related words in the content like “BMW”, “Mercedes”, “Used Cars”, “automobile” etc. That means, having synonyms or related phrases might produce a fruitful result for your web page ranking.

52. Code Error on Page

Lots of HTML, CSS, JavaScript, and PHP errors and broken code indicate a poor-quality webpage, which is going to hurt you in SERPs.

53. Page TF-IDF

Term Frequency-Inverse Document Frequency weighs how often a keyword appears on a page against how rare it is across documents in order to understand the article. It ignores words like "the", "or", and "it" in the computation and estimates how many times a human writer would naturally mention a phrase in an article, which gives reasonable salience predictions. Roughly, tf-idf(t, d) = tf(t, d) x log(N / df(t)), where tf(t, d) is how often term t appears in document d, N is the total number of documents, and df(t) is the number of documents containing t.

54. HTTP Expires Headers

Misconfigured Expires headers can cause problems with search indexing by telling search engines that the content will not be refreshed again for a potentially long time.
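
As a rough sketch, a sensible set of caching headers for a static asset might look like the following; the exact values depend on how often your content really changes:

    Cache-Control: max-age=604800
    Last-Modified: Mon, 01 Jun 2015 10:00:00 GMT
    Expires: Mon, 08 Jun 2015 10:00:00 GMT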

55. Human Editor

Although Google’s algorithms are not smart enough yet to detect all loopholes, they also allow humans (Manual Action) to influence SERPs in order to provide better results.

II. WebSpam

On-Page Factors

56. Panda Penalty

On 23rd February 2011, Google launched the first Panda algorithm in order to punish thin and duplicate content. This was the major step taken by Google that affected 12% of search queries.

57. Redirects

Sneaky redirects/cloaking are prohibited. Websites get banned or penalized for doing so.

58. Linking to Low Quality Content

Be aware of your external links. Linking to low-quality neighbors can affect you in a negative manner.

59. Page Over Optimization

Over-optimizing a page is considered an attempt to game Google. These kinds of articles are written for search engines instead of readers, and they don't provide any value. Keyword stuffing is a classic example of over-optimization.

60. Site Over Optimization

This one is the same as above, but at domain level. It generally includes keyword stuffing, excessive keywords (and links) in the header or footer, hidden text etc.

61. PageRank Sculpting

Excessive PageRank Sculpting, i.e. adding the Nofollow tag to all outbound and internal links, might hurt you.

62. Meta tag Spam

It is usually observed that the majority of keyword stuffing is done in the Meta description. Repeating your focus keyword in the meta tag can penalize your entire website.

63. Affiliate Links

Sites with affiliate links are observed carefully and if your website contains a lot of distracting links directed at an eCommerce site, you deserve to be penalized.

64. Hiding Affiliate Links

If you are trying to hide a few affiliate links with cloaking, it is better not to do so.

65. Distracting Ads

Distracting ads like Pop up and Pop under are generally found on low-quality websites/content. This includes too many ads on a single page, high ads to content ratio, “download” ads, and extra-large and flashy banners.

66. Long Internal Link Anchors

Too long internal anchor text could draw keyword stuffing webspam penalties. Make sure your anchor texts are a reasonable length.

67. Overuse of Heading Tags

Make sure you are not using too many heading tags (h1, h2 or h3). Placing multiple h1 tags could result in a penalty. The best technique is to use a single h1 tag followed by a reasonable number of h2 (and h3) tags.
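
In practice, a safe heading structure for a post looks something like this (the headings are placeholders):

    <h1>200+ Google Ranking Factors</h1>
    <h2>Page Level Factors</h2>
    <h3>Keyword in Title Tag</h3>
    <h3>Content Length</h3>
    <h2>Domain Factors</h2>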

68. High Link-to-Text Ratio

Having a webpage that is all links and no added content is a sign of a low quality site. You might get penalized for doing this.

69. Ads Above the Fold

Ads above the fold are OK; even Google's own search results show ads in the top positions. But having so many ads that readers must scroll past them to find the actual content is not a sign of good quality. Remember, visitors are Google's first priority, and distracting them from valuable content in any form could cause a penalty.

70. IP Address Spam

If the IP address of your server is already blacklisted by Google or if it contains too many spammy websites, this could affect your SERPs.

71. Autogenerated Content

According to Google's guidelines, if your site delivers automated or computer-generated content, it could result in a penalty.

72. JavaScript and CSS hidden Content

If your site uses JavaScript, make sure you place the same content inside a <noscript> tag in order to avoid a cloaking penalty. Also, make sure your CSS and JavaScript files are accessible via your robots.txt file.
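
A minimal sketch of both ideas, with hypothetical paths: a <noscript> fallback for script-generated content, and robots.txt rules that keep CSS and JavaScript crawlable:

    <script src="/js/related-posts.js"></script>
    <noscript>
      <p>Related posts: <a href="/seo/optimizing-images">Optimizing Images</a></p>
    </noscript>

    # robots.txt - let crawlers fetch stylesheets and scripts
    User-agent: *
    Allow: /css/
    Allow: /js/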

Off-Page Factors

73. Penguin Penalty

On 24th April 2012, Google released the first Penguin algorithm, designed to target link bombing and spamdexing. Its aim is to drop the ranking of websites that violate Google's guidelines yet still manage to stay on top.

74. Percentage of Low Quality Links

The majority of low-quality links are the result of Black Hat SEO, such as blog commenting, forum profiles, etc. If you don't have backlinks from high-authority sites, your site can't rank at the top for highly competitive keywords.

75. Linking to Relevant Site

A large number of irrelevant or off-topic links can cause a Penguin penalty.

76. Unnatural linking

Unnatural linking refers to any link building done with the intent of fooling search engines. When Google detects unnatural links, it sends manual penalty warnings and messages (thousands of them) through Webmaster Tools, and this usually results in a ranking drop.

77. Link Building on Same Server IP

Not only does Google observe the quality and quantity of backlinks, it also observes how and from where the site is gaining those backlinks. Getting links from the same server IP is a sign of blog network link building, which will not give you any benefit.

78. Anchor Text Spam

Getting the majority of your backlinks with the same anchor text is a sign of spam and usually results in a Penguin penalty.

79. Selling Links

Gone are the days when buying or selling links helped you to rank high. Nowadays, the situation is exactly the opposite; doing so can penalize your website for building unnatural links.

80. Manual Penalty

In spite of all the smart algorithms search engines have, what if you still know a trick to rank high? In that particular case, Google can take manual action against any thin-quality content.

81. Disavow Tool

The Disavow tool lets webmasters tell Google to ignore specific links. If you believe certain links can harm your site, you can submit them through the disavow tool as a plain text file. The tool is generally used when trying to lift manual and algorithmic penalties.
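
The disavow file is just a plain text list, one URL or domain per line; a minimal sketch (the domains are placeholders):

    # Spammy forum profiles pointing at my site
    https://spam-forum.example.com/profile/12345
    # Disavow every link from this domain
    domain:link-farm.example.net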

82. Chrome Blocked Sites

Google Chrome comes with a tool that allows you to block certain sites. There is no concrete proof that this is used as an automated ranking factor, but it is hard to believe that nobody (especially on the webspam team) is looking at this data.

83. SandBox

The Google Sandbox (an informal name) is a concept that applies to brand-new websites only. If a new site suddenly gains a large number of ordinary links, it is put in the sandbox, meaning it is temporarily kept out of the top results.

84. Google Dance

In this concept, Google runs an update on a specific day and then gradually pushes out its impact over the next 10-15 days or so throughout the month. The cycle repeats every month.

85. Overuse of Bold, Italic and other emphasis

Bold tags are often given additional weight compared to the rest of the content. However, if you make your whole content (or major part) bold or italic, you could get penalized for “spammy activities”.

86. Reconsideration Request

A webmaster can send a reconsideration request in the case of a manual action. Within 2-3 weeks, Google replies to say whether or not the penalty has been revoked. However, many webmasters report that they haven't seen any change in traffic behavior even after the penalty was revoked.

III. Backlinks

87. Number of Pages Linked

This is the number of pages linking to your site. Whether links come from the same domain or from different sources, they matter.

88. PageRank of backlinks

Just one link from a higher PageRank (PR) site is more powerful than a number of links from sites having low PR. For example, it is more profitable to have 1 link from a PR 10 website rather than having 100 links from different PR 3 sites.

89. Linked Domain Age

The age of the website linking to you plays a very crucial role in ranking. An old domain has a much more powerful impact than a new one.

90. Number of links from separate classes

Links from separate class-C IP addresses (i.e. different servers and networks) indicate how widely known your site is.

91. Links from .gov, .edu and .org domains

Many people have the misconception that links from other domain extensions, like .edu and .gov, are more powerful than links from the .com extension. Google gives equal importance to all domain extensions. That means it's the quality of the link that matters, not the extension.

92. Authority of Backlinking Domain

The authority of a domain ranges from 0 to 100, and it's harder to climb from 40 to 50 than from 10 to 20. The higher the authority of the backlinking website, the more you benefit.

93. Placement of Link (Content)

The link at the beginning is slightly more powerful than the link at the bottom of the content.

94. Placement of Link (Page)

Links present in the content body are given more priority than the links in the footer, header and sidebar.

95. Rapid Gain/Loss of Links

Rapid link growth or fall is highly likely to invite additional scrutiny from webspam filters. However, it seems fine when it comes from genuine (or high authority) webpages in case the content goes viral.

96. Social Share Link

Social Share can influence the value of a page that has been shared. In fact, large numbers of g+ page followers can improve your organic traffic, as the followers can see your post in search results.

97. “Our partners” Link

A few keywords like “sponsored links”, “our clients”, “advertisement”, “sponsors” etc. can degrade the value of links.

98. Anchor Text (Backlink)

Backlink anchor text often provides a small description of a web page. Nowadays, getting excessive links from the same anchor text is considered as a sign of webspam.

99. Anchor Text (Internal Link)

Internal anchor text links are also counted in the ranking factor, but they are not as powerful as backlink anchor text.

100. Link Title

Here we are talking about the link title that appears when you hover over the anchor text. Although it doesn't have much impact, it is still taken into account.

101. Alt text of Image Link

You don't get links only through content text but also through images. Alt text acts as the anchor text of an image link on the backlinking website.

102. Number of Root Domains

More importance will be given if a website’s home page points to your site. The quantity as well as quality of linking root domains play a crucial role in ranking web pages.

103. Content Links

Links embedded within the article body are far more powerful than links in the comment section or a forum profile.

104. Link Type

A high percentage of links that come from the same types of sources, like forum profiles, blog comments, is a sign of unnatural linking or spamming.

105. Backlink Age

Older backlinks are given higher priority over newer ones.

106. Competitor Links

Getting links from competitive websites or sites ranking for the same keyword in search engines can boost your performance.

107. Links from low-quality sites

A low quality site is one that contains duplicate or low-added-value content, spam, automated redirection, high ad to content ratio, nasty layout, etc. Getting links from these sites likely results in a penguin penalty.

108. NoFollow Links

Google says they don’t follow them, but in certain cases they do. For better results, you must add rel=”nofollow” to untrusted user comments, guest-posts and paid links.
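
For instance, a paid or untrusted link can be marked like this so that it passes no PageRank (the URL is a placeholder):

    <a href="https://advertiser.example.com/" rel="nofollow">Sponsored partner</a>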

109. 301 Redirects to Page

Excessive links coming through 301 redirects can affect PageRank negatively.

110. Country Specific Domain

Getting links from country domains like .co, .uk, .in, .de, .cn, .us, etc. can help your website rank higher in that particular country.

111. Linking Domain Relevancy

Generally, a link from a similar website is more powerful than a link from an off-topic website.

112. Linking Page Relevancy

Same as above, but this works at the page level, i.e. links from similar page content have a more positive impact than links from random pages.

113. Link Review

Google also analyses the words around the anchor text of the website pointing at your page. Somehow, the ranking depends on whether your site is part of a recommendation or a negative review. And, of course, negative emotions around links can hurt your ranking.

114. Similar Keywords in the Title

Links on the page that contain your focus keyword in the title tag are given more importance.

115. Link Velocity (Positive)

Link velocity is the rate of speed of links you are getting. For example,

Week 1: 10 backlinks

Week 2: 20

Week 3: 50

Since the links over time are increasing, Google will detect the increased importance of your website and rank you better.

116. Link Velocity (Negative)

This is the opposite case to the above; if backlinks are decreasing consistently, Google will detect the decreased importance of your site and drop your rank in SERPs.

117. Nasty or Unnatural Anchor Text

An excessive amount of brand-name anchor text, "click here" anchors, and bare URL anchors naturally invites devaluation of a page in the search engine's eyes.

118. Link from Wikipedia

Links from Wikipedia are a sign of trusted information. Although the links are nofollow, they are still counted by Google.

119. Links from Top Sources

According to the Hilltop Algorithm, links from special and top resources (Hub) such as Techcrunch or Theverge likely carry more weight.

120. Authority Site Links

A website containing numerous backlinks from Top Sources (Hub) is generally known as an “Authority site”. Getting links from authority sites can boost your performance in search engines.

121. Co-occurrence

Google doesn't look only at the anchor text itself; it also looks at the words around it, which helps determine whether a backlink is a genuine recommendation or just part of a spamming activity.

122. Links from spam blogs

Nowadays, the number of new blogs is increasing at an exponential rate and most of them are somehow involved in spam related activities. Getting too many links from new spammy blogs can harm your site.


123. Reciprocal Links

Reciprocal linking, also known as link exchange, is nowadays treated as a violation of Google's guidelines when done excessively. And we all know the consequences of violating Google's rules.

124. DMoz

Not officially confirmed, but experts believe that Google gives slightly extra priority to those sites that are listed on Dmoz.

125. Yahoo Directory Listed

Yet another unconfirmed factor: Google gives extra priority to sites that are listed in the Yahoo Directory. How long a site has been listed is also taken into account.

126. Number of Outbound Links per Page

Backlinks from a page that already contains a hundred or even thousands of other outbound links pass very little PageRank.

127. SiteWide Links

A link that appears on all pages of a website is known as a sitewide link. These links are compressed so that all of them count as a single link.

128. Content Length of Linking Page

Usually, a link from a page containing 1000 words is more powerful than a page with 50-100 words.

129. Quality of Linking Page

Links from spammy, duplicate, no-added-value content are not very useful. In fact, they can harm the reputation of your site.

130. Forum Links

The level of spamming has increased tremendously in the last couple of years, in which forum profile generation and linking were the most common issues. That’s why Google reduced the value and importance of forum links.

131. User Generated Vs Actual Site Owner Links

Google can easily identify whether the link has been generated by a user with the intent of gaining backlinks or whether the link has been recommended by the actual site owner. A higher priority is given to the link that is embedded by the site admin.

132. Guest Post links

Guest posts have become a common vehicle for link bait and spam. That's the reason Google no longer gives much importance to guest post links.

133. Natural Link profile

Considering all of the backlink factors above, if your overall linking profile looks natural (i.e. not paid for and not built with the intent to fool the search engine), then Google is obviously going to rank you well in the search index.

IV. Website Level

134. Special Added value or Unique content

If your site just copies or rehashes content from other websites, it is not going to help you. Google is always hungry for unique content, and if you can't provide it, don't expect organic visitors. Even when you draw on other sources, try to include additional value that is useful for readers and can't be found anywhere other than your site.

135. Site Layout

A layout should be as simple as possible where users can easily read, navigate, comment, travel from one page to another, and identify the difference between ads and real content.

136. Number of Pages

Not a very powerful factor, but it is sometimes used to identify the quality of a website. A thin website (generally affiliate sites) has a limited number of pages, whereas popular and high-quality sites such as Themeforest, Moz, and Mashable have thousands of pages.

137. Duplicate Meta Information

Duplicate Meta data can drop your performance in SERPs. That’s the reason most webmasters avoid indexing tags, archive and category pages.

138. YouTube

YouTube videos are always given special value. In fact, the traffic on YouTube has increased linearly since the release of Google Panda. You can generate more organic traffic by uploading videos related to your site’s content.

139. Site Usability

A site with nasty text colors, fonts, or backgrounds, or one that is difficult to navigate, provides a bad user experience. This usually results in a high bounce rate, which can reduce your organic traffic, so always pay attention to these on-page optimization factors.

140. Site Uptime

Frequent downtime due to maintenance or server issues can hurt your indexing position in Google.

141. Breadcrumb Navigation

This is a sign of good web structure that helps readers as well as search engines know the path direction of the opened page.

142. Responsiveness

Is your site compatible with the latest devices (tablets, smartphones)? Responsive sites get an advantage in search results on mobile and tablet platforms.
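
At a minimum, a responsive page declares a viewport and adapts its layout with CSS media queries; a tiny sketch (the class name and breakpoint are hypothetical):

    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      @media (max-width: 600px) {
        .sidebar { display: none; } /* hide the sidebar on small screens */
      }
    </style>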

143. Contact Us page

You get a plus point if your website has a contact us page with sufficient and accurate information. It’s good if your contact details match the whois info.

144. Service or Policy Pages

These kinds of pages simply help search engines and readers gain trust in your website.

145. Server Location

Server location really matters for geo-specific searches. You might get better results in the particular country where your server is located.

146. Sitemap

A sitemap helps search engines find web pages and index them quicker. A Sitemap is important for new websites that don’t have any backlinks.
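
Besides submitting the sitemap in Webmaster Tools, you can also point crawlers to it from robots.txt (the URL is a placeholder):

    Sitemap: https://www.example.com/sitemap.xml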

147. Site Updates

No one wants to read outdated content. Adding and updating content is a way to tell Google that your site is active. Moreover, adding 4 quality articles per week can boost your site’s ranking.

148. SSL

SSL stands for Secure Sockets Layer, and it is essential these days for eCommerce websites. Blogs and purely informational websites don't strictly need SSL.

149. User Reviews

Being bad to your customers or readers is bad for business. The same fact is used by Google. Lots of negative reviews on popular sites (such as yelp.com), can degrade your website’s performance in the search engine.

150. TrustFactor

TrustFactor doesn’t have any specific algorithm, and it’s a general term that Google commonly uses. TrustFactor is a mixture of worthwhile backlinks, accuracy of data presented on websites, correct domain information and brand name.

151. Google Service

Two tools by Google, i.e. Analytics and Webmaster Tool, are used to analyze traffic and search indexing. Integrating your website with these tools will help you a lot. You can get exact real-time traffic data, bounce rate, landing pages, site speed, index status, crawl rate information and much more.

V. Special Algorithm

152. Wide Results

Google never returns the same type of information, especially in the top 5 results. For example, if you search for “Panda”, you will get diverse results that include Panda Animal (information/images/videos), the Google Panda algorithm and Panda Security company.

153. Fresh Results

Nobody wants to see outdated information, especially when the query is time-sensitive, so Google gives priority to recently updated data in search results.

154. Browser History

If you have already signed into your Google account (Chrome), you will see that the website you visited the most sometimes gets priority in SERPs. These kinds of search results are specially optimized for a single user according to his likes and dislikes.

155. High Dwell Time

This is the actual duration of time that a visitor spends on a webpage before returning to the search engine. The longer the dwell time, the better it is for your website – this states that the visitor has consumed most of the content on a page before leaving your site.

156. Safe Search

Adult images, videos, and information are automatically filtered out if you keep the SafeSearch option on.

157. GEO based results

Google gives priority to sites having local server IP and country domain extensions.

158. Search History

Your search history influences the later query results. If you search for “computers” and then again search for “review”, Google might return “computer reviews” results at the top of SERPs.

159. Crawl Rate Modification

Google Webmaster Tools allows you to modify the rate at which Googlebot crawls your site. If, for some reason, you slow it down to zero, it can cause problems with indexing, which in turn creates ranking problems, especially if your website revolves around news or fresh content.

160. Quick Answer Box

The Quick answer box is designed to give you a short and useful answer without digging into any specific webpage. The answer is extracted from the webpage indexed by Google (usually from high authority & trusted sites).

161. Google+ Advantage

Your website will rank higher in the Google search engine for those people who have added you to the g+ circle.

162. Domain Diversity

Usually, a search result is classified as low quality if too many results are from the same domain. To provide diverse and quality information, Google returns different results from different domains.

163. Schema.org Format

Pages that use the schema.org format usually rank well in search engines. Data types such as breadcrumbs, things, people, web pages, and reviews are used to mark up HTML pages.
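
One common way to add schema.org data is a JSON-LD block inside the page; a minimal sketch for an article (the values are illustrative):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "200+ Google Ranking Factors",
      "author": { "@type": "Person", "name": "Nohman Habib" },
      "datePublished": "2015-06-01"
    }
    </script>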

164. Knowledge Graph

The Knowledge Graph represents semantic-search information gathered from a wide variety of sources. The aim is to provide users with structured and detailed information without navigating to other sites.

165. Local Searches

Google displays g+ results and reviews (on the very first page) for local destination searches.

166. Image Results

Organic traffic doesn't come only from web searches but also through images. Google indexes your image files (.png and .jpeg are the most common formats) in its image database. However, you can tell Google not to index your images using the robots.txt file.
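
For example, to keep Google's image crawler away from a folder of images, you could add something like this to robots.txt (the path is hypothetical):

    User-agent: Googlebot-Image
    Disallow: /private-images/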

167. Special Searches

Google displays different results for different queries. For example, for shopping related keywords (buy/sell), you get top results from eCommerce websites.

168. Google News

The Google News story box displays a cluster of articles on the same topic, from which you can get in-depth knowledge about particular events.

169. Single Site Results

If you search for a specific site page with a brand keyword, Google will return you numerous pages from the same site.

170. DMCA complaints

Getting too many DMCA takedown complaints is a sign of duplicate and thin content, which can drop your ranking in SERPs.

171. Easter Egg Results

Google has more than 15 “Easter Egg Results“. For example, searching for “Drag Queen” will return a rainbow navigation bar.

VI. Domain Rule

172. Keywords in the Domain

Having a keyword in the domain is still an additional plus point for you, but it is nowhere near as powerful a signal as it used to be.

173. Keyword Position in Domain

For long-term benefits, you should get a keyword in the root domain at the very first position. This always has an edge over keywords in the middle or end of the domain name.

174. Hyphen-Separated URL words

The ideal method of separating words in the URL structure is to use hyphens instead of underscores, because search engines treat hyphens as word separators while underscored words may be read as a single term. For example, /google-ranking-factors reads as three words, whereas /google_ranking_factors may not.

Using hyphens will not make a website rank higher.

175. Exact Match Domain

An Exact Match Domain (EMD) can have both a positive and a negative effect. The plus is that if your site contains unique, quality content, you can rank very high (1st-5th result) for targeted keywords in less time. The minus is that if you have thin, duplicate, or below-average content, you might get penalized by the EMD algorithm.

176. Keyword in the Subdomain

A subdomain containing a focus keyword can boost your performance in SERPs.

177. Age of Domain

Google uses the domain age parameter in their ranking techniques. However, this factor is not so crucial. There is not a big difference between a 6 month old domain and a 1 year old domain. But the situation will be different if you compare a 1 year old website with a 10 year old website.

178. Domain Privacy

Don't hide your domain details behind a whois privacy service unless it is really necessary. Doing so can make your site look less trustworthy, especially if you don't yet have many visitors or backlinks. Also, make sure the domain registration and contact data don't conflict with your privacy policy or any data displayed on the website.

179. Domain History

A change of whois ownership or numerous domain drops can signal Google to reset the domain's history and disregard the links pointing to it.

180. Domain Extension

Country top-level domains (such as .au, .in, .ca, .de) help you to rank well in a particular country. On the other hand, they also limit the website's ability to rank in other countries.

181. Domain Valid till

This is a very tiny factor, but it is still in play. Valuable or high-authority domains are usually registered years or even a decade in advance, whereas low-value domains are rarely registered for more than a year. That's why a domain's expiry date can be used to predict its value.

For more details, please visit:  https://www.codingace.com/hosting/things-to-check-before-registering-a-domain

182. Penalized Owner

If Google identifies a particular owner (through whois data) as spamming or distributing malware, it could restrict indexing of all other current and future websites owned by the same person.

183. Parked Domain

A parked domain is one that is registered for future use and doesn't have any quality content; sometimes such domains are filled with nothing but ads. On 1st December 2011, Google released a set of algorithm changes to remove parked domains from the search index.

184. Domain Marked as Spam Permanently

Be careful while registering for a new domain. Check the history of a domain name before buying it. There are chances that the domain has already been marked as spam by Google because of its past behavior. The probability of getting indexed for the same name in the future is almost none.

VII. Social Network Impact

185. Number of G+1’s

Although this was never confirmed by Google, case studies suggest that you get an advantage in SERPs if you have a large g+ circle.

186. Authority of G+ user Account

Google gives more importance to the +1s that come from authoritative accounts (having a large number of followers).

187. Number of Tweets

A higher number of re-tweets can influence your rank in Google.

188. Authority of Twitter user Account

Tweets coming from authorized or large follower accounts usually have a greater effect than low-profile accounts.

189. Facebook Likes

Not a strong factor, but Google looks at the number of likes on an official page to gauge its popularity.

190. Facebook Shares

Fb shares have a stronger effect than likes. Sometimes, a large number of shares represents the quality of the content.

191. Authority of Facebook User Account

Just like Twitter and g+, shares/likes coming from popular accounts or pages carry much more weight.

192. Other Social Sharing Sites

Google might also gather information from other small social networking sites, such as LinkedIn, Stumbleupon, Reddit, etc., in order to know how a website is performing among different social accounts.

VIII. User Interaction

193. Direct Traffic

Google uses data from Chrome to show how often readers visit your site by directly typing in the domain URL. A large percentage of direct traffic is a sign of high quality and compelling content.

194. Bounce Rate

The bounce rate is the percentage of visitors who leave the site after viewing a single page. An excessively high bounce rate can degrade a website's performance in search engines.

195. New vs. Returning Visitors

Google also checks whether readers come back to your page again. A large number of returning visitors might pull your rank a little upward in SERPs.

196. Google Toolbar Data

The Google Toolbar collects data such as pageload time in seconds, dwell time, landing page, bounce rate, etc. Google uses this data to analyze the behavior of the audience.

197. Chrome Bookmark

They also collect the bookmarked pages data. Articles that get bookmarked on Chrome might perform well on the Google search engine.

198. Organic CTR for specific keyword

Pages in search engines which have a higher click through rate usually get a boost in SERPs for that particular keyword.

199. Organic CTR for all keywords

If the click through rate is very high for almost all keywords, then overall site performance also increases in the search index.

200. Time Spent

Google also collects some data, such as how much time is spent by visitors coming through the search engine and social networking sites. If people from different locations spend a lot of time on your site, you will benefit in SERPs.

201. Number of Comments

The number of comments on each post represents the interaction as well as the interest of your audience.

IX. Brand Name

202. Brand preference

After the Vince update, Google started favoring big brands. A brand is a mixture of high trust, authority, reputation, and PageRank.

203. Brand Anchor Text

Brand anchor text is a strong brand signal which is used while ranking your website.

204. Brand Searches

Google also looks at whether the audience searches for your brand keyword on the search engine. For example, if someone entered “most expensive software” or the same keyword followed by the brand name “most expensive software 3rank”, the second query would help Google recognize your brand.

205. RSS Subscribers

FeedBurner RSS is owned by Google, so obviously they look at how many subscribers you have.

206. Brand on Twitter

A Twitter profile with lots of followers represents the popularity of the brand.

207. Brand on Facebook

Usually, a quality brand has thousands of shares and millions of likes on Facebook.

208. Brand on News

Big brands are often mentioned on Google News and on other authoritative news sites.

209. Non-Linked Brand

Google can easily recognize a brand with or without hyperlinks. It is not necessary that your brand name should always link to your site in order to get attention from search engines.

210. LinkedIn Page

It is found that most real businesses have an official page on LinkedIn. Google might take advantage of LinkedIn pages to get more information about the brand.

211. LinkedIn Employee

Not a very crucial factor, but according to Moz, employees' LinkedIn profiles that list your company as their employer act as a brand signal.

212. Social Media Behavior

Getting thousands and millions of followers is not enough. How often you interact with them matters the most. An old profile with thousands of followers with just a couple of tweets can’t be considered as an active or well established brand.

213. Tax Paying Business

In some (rare) cases, Google might look into whether a site is a tax-paying business or not. If it is, you might get an additional plus point.

214. Brand Location

Big brands have offices where people work. Google checks for the location of your brand through g+ data and if it is there, they will index it into their database.

X. Recent Updates

215. SSL/HTTPS (per URL signal)

Google is now giving more priority to security. For now, HTTPS is a lightweight signal carrying less weight than other ranking factors, such as quality of content and backlinks.
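
If you move to HTTPS, make sure the old HTTP URLs 301-redirect to the new ones. On an Apache server with mod_rewrite enabled, one common sketch looks like this:

    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]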

216. Mobile Friendly Update

On 21st April 2015, Google released a significant new mobile-friendly ranking algorithm designed to boost mobile-friendly webpages in Google's mobile search results. Desktop SERPs were not affected by this update.

For further details:

SEO Tips to Consider while Launching a New Website

12 things every webmaster should take care of

WordPress SEO Best Practices For Beginners

