What's the fastest way to boost your search engine page rank in Google?
The simple answer is "link building". Google ranks your site based largely on the links pointing to it, so it's sensible to spend more effort on link building. If you work to get links from higher-PageRank websites, your website's PageRank will go up in turn. Articles posted on reputable websites like Buzzle and PRWeb can help boost your PageRank quickly. If you have a blog or something similar, offering to write guest posts can be great for your PageRank as well.
How to submit your site URL to Google, Bing and Yahoo?
Normally, you don't need to tell Google, Bing or Yahoo to visit your site; their crawlers find it automatically. However, if you want to tell these search engines about your URLs, you can submit them manually.
Submitting your website URL to Google
http://www.google.com/addurl/
You need to enter your full URL, including the http:// prefix. For example: http://www.google.com/. You may also add comments or keywords that describe the content of your page. These are used only for Google's information and do not affect how your page is indexed or used by Google.
Only the top-level page from a host is necessary; you do not need to submit each individual page. Google's crawler, Googlebot, will be able to find the rest. Google updates its index on a regular basis, so updated or outdated link submissions are not necessary. Dead links will 'fade out' of the index on the next crawl, when Google updates its entire index.
Submitting your website URL to Bing
http://www.bing.com/webmaster/SubmitSitePage.aspx
Generally, Bing's web crawler, MSNBot, can find most pages on the Internet. However, if your site does not appear on Bing, you can submit your URL at the above link.
Submitting your website URL to Yahoo
http://au.docs.yahoo.com/info/suggest.html
The goal of Yahoo! Search Technology is to automatically discover and index all of the content available on the web predominantly by following hyperlinks to provide the best possible search experience to users.
You can use the link below to submit your site to Yahoo! Site Explorer.
http://siteexplorer.search.yahoo.com/au/free/submit
You can submit your website or webpage and site feed to invite more traffic from Yahoo.
What is a 301 redirect? And how can I apply it to my website?
What is 301 redirect?
It's an HTTP redirection technique that informs browsers and search engines that the "page has permanently moved to a new location."
When can I use this function?
301 redirects are useful in the following circumstances:
You have moved your web site to a new domain, and you want to make the transition as seamless as possible.
People access your site through several different URLs. If, for example, your home page can be reached in multiple ways - for instance, http://myhome.com, http://home.myhome.com, or http://www.myhome.com - it's a good idea to pick one of those URLs as your preferred (canonical) destination, and use 301 redirects to send traffic from the other URLs to your preferred URL. You can also use Webmaster Tools to set your preferred domain. In this way you can prevent your page ranking from being split across 3 or more different URLs (i.e. the Google spider would otherwise update its index for http://myhome.com, http://home.myhome.com, and http://www.myhome.com as 3 different websites).
You're merging two websites and want to make sure that links to outdated URLs are redirected to the correct pages.
How can I apply it to my website?
To implement a 301 redirect for websites hosted on servers running Apache, you'll need access to your server's .htaccess file. (If you're not sure about your access or your server software, check with your web host.) For more information, consult the Apache .htaccess Tutorial and the Apache URL Rewriting Guide.
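As a sketch (assuming an Apache server with mod_alias and mod_rewrite enabled; the file paths and domain names are examples only), .htaccess 301 rules might look like this:

```apache
# Permanently redirect a single page (mod_alias)
Redirect 301 /old-page.html http://www.newlocation.com/new-page.html

# Permanently redirect the non-www domain to the preferred www domain (mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^myhome\.com$ [NC]
RewriteRule ^(.*)$ http://www.myhome.com/$1 [R=301,L]
```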
In IIS, you need to change the "Home Directory" property to the "A redirection to a URL" option and fill in the new URL.
With PHP (the header() calls must run before any output is sent to the browser):
header("HTTP/1.1 301 Moved Permanently");
header("Location: http://www.newlocation.com");
exit;
With classic ASP:
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.newlocation.com"
Response.End
Have you heard about Pay Per Call?
You might have heard about pay-per-click (PPC) or pay-per-impression (PPI) from Google AdWords. Pay Per Call, also known as "pay per phone call" (PPCall), "click to call" and "Call On Select", is a relatively new marketing technology that is sure to start getting more attention in the coming years.
Normally in pay-per-click marketing, you either have your website visitors fill out a contact form so you can get back to them later (to provide more details about your product before they choose to buy), or you provide your product details and ask visitors to contact you. The bad part is that your salesperson may never make contact with the prospect, since people get busy or have already purchased through someone else. In the end, sales conversions come from only a portion of the pay-per-click campaign.
What is Pay Per Call then?
Pay Per Call marketing encourages the user to establish phone contact rather than sending them to a website link, and this is where the real potential lies. Pay-per-call advertising cuts down on the number of clicks a user has to make in order to get the information they need from a site.
How can I set up pay per call?
A pay per call ad is set up much like a Pay Per Click advertisement, except with a phone number added and highlighted.
There are two models:
1) A toll free number is provided to the user which allows the ad network to track calls made to the advertiser.
2) The user enters their phone number, clicks a button and the ad network dials the advertiser and the user simultaneously. The user picks up their phone and can hear the connection being made. As the advertising network is handling the connection for the call, the user is not charged for it. This also adds privacy for the user, as the advertiser is not provided with their phone number.
Who are the pay per call providers?
Google has "Click to Call". In some Adwords ads, a green phone icon is displayed. When the icon is clicked, a form appears prompting a user to enter their phone number. When the number is entered, Google calls the advertiser and the user and a connection is made between the two parties.
Ingenio provides toll-free numbers in Pay Per Call ads which forward directly to your business phone number. Ingenio charges on the first call from an identifiable phone number within a 30-day period. Subsequent calls to the toll free number within that time from the same number are free. Ingenio doesn't charge for short calls, hang-ups or unanswered calls.
FindWhat (aka Miva) works on a similar model to Ingenio; i.e. they provide a toll-free number for users to call.
Labels:
Adword,
Call On Select,
click to call,
FindWhat,
Ingenio,
Miva,
pay per call,
Pay per phone call
Google Analytics and Google WebMaster Setup
Google Analytics is a free web service from Google to track visits and statistics for your website. It gives excellent reports on your site usage, website content and visitors, in charts and graphs.
Google Webmaster Tools is a free web service from Google for webmasters. Basically, it helps webmasters check indexing status and optimize the visibility of their websites.
Using these tools, a webmaster can:
Check and set the crawl rate, and view statistics about how Googlebot accesses a particular site
Generate and check a robots.txt file
List internal and external pages that link to the site
See what keyword searches on Google led to the site being listed in the SERPs, and the click through rates of such listings
View statistics about how Google indexes the site, and if it found any errors while doing it
Set a preferred domain (e.g. prefer "google-page-ranking.blogspot.com" over "www.google-page-ranking.blogspot.com"), which determines how the site URL is displayed in SERPs.
SEO Jargon/terms
If you are new to SEO, you will most likely come across certain SEO-specific terms and jargon you've never heard of. This article is to make your life easy.
Here is a list of search engine optimization terms that you will commonly come across on many web resources.
SEO - Search Engine Optimization: the planning and adjusting of the content of a web page in order to improve its position in natural, organic search results, including modifications to code and displayed content.
SEM - Search Engine Marketing: any marketing activity involving a search site, including advertising on search result pages and paying for placement.
SMO: Social Media Optimization
SERP - Search Engine Result Page: the page that displays the results of a search.
Sitemap: a file created in XML format that helps search engine spiders distinguish the structure of your website and instructs them how often to crawl certain pages on your website (This is different from HTML sitemap).
Crawl: the action of search engines traversing through the Internet while updating their database of websites.
Spider: a piece of code (packet) that is sent out from a search engine to crawl the web to build and edit its search engine database.
Conversion Rate: a metric to evaluate the effectiveness of a conversion effort - the number of visitors who took the desired action divided by the total number of visitors in a given time period.
CPC - Cost Per Click: the amount an advertiser pays an ad host each time a visitor clicks on the advertiser's link. (see Pay Per Click)
CPM - Cost Per Thousand: the cost per thousand people viewing an ad or listing.
Absolute Link: a link that displays the full path of a website URL that is linked to.
Anchor Text: the text that is clicked on to activate and follow a hyperlink to another web page.
Backlink: a link to a website.
CTR - Click Through Rate: the number of clicks on a link, as a percentage of the number of views of the link. (( no. of clicks / no. of views ) x 100)
White Hat: a term used to describe SEO techniques that adhere to proper and acceptable on-page and off-page optimization.
Black Hat: a term used to describe any SEO techniques utilized to manipulate the search engines.
Cloaking: a black-hat SEO technique that manipulates search engines by serving the search engine spider specific content that is different than what the normal surfer sees.
Delisting: the removal of a web page from a search engine’s results.
Description Meta Tag: a meta tag describing the content of the web page.
External Link: a link from another website that links to yours.
Reciprocal Link: a link from a website that links back to your site, in exchange for linking to that website.
One Way Link: an external link that does not require your website to link back to that site.
Internal Link: a link that exists and links to other web pages within your website.
Outbound Link: a link from your website to an external website.
Link Spamming: a black hat technique used to generate and acquire bogus external links to manipulate search engine rankings.
Index: a search engine's database, consisting of all the web pages it has crawled and recorded.
Meta Tag: Html elements used to provide structured metadata about a web page.
Alt Attribute: the description text that is associated with an image.
Keyword: the word(s) or phrase(s) a person types into a search box.
Keyword Meta Tag: a meta tag listing the main keywords and keyphrases that are contained on that web page.
Keyword Density: a formula to determine how frequently a keyword appears on a web page. The formula is the total number of words in all keyword mentions divided by the total number of words on the page. Keyword density should generally fall between 2% and 8%.
Keyword Stuffing: a blackhat technique to manipulate search engines by overly displaying a keyword or keyphrase, unnecessarily.
Landing Page: the destination page a visitor arrives at when clicking on a link.
PageRank: Google's indicator of a particular page's value.
Organic Traffic: traffic generated as a result of being indexed within a search engine (vs. paid traffic).
Paid Traffic: traffic generated as a result of using paid advertisements (vs. organic traffic).
Rank: the position of a web page within a search engine.
Robots File: a file in the site's root directory that instructs search engine spiders to ignore certain pages or directories.
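The click-through rate and keyword density formulas above can be sketched in Python (the function names are illustrative, not from any SEO library):

```python
def click_through_rate(clicks, views):
    # CTR = (number of clicks / number of views) x 100
    if views == 0:
        return 0.0
    return clicks / views * 100


def keyword_density(text, keyword):
    # Density = (words belonging to keyword occurrences / total words) x 100
    words = text.lower().split()
    kw = keyword.lower().split()
    n = len(kw)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return hits * n / len(words) * 100


# 25 clicks on 1,000 ad views -> 2.5% CTR
print(click_through_rate(25, 1000))
# "seo" appears 3 times in this 10-word snippet -> 30% density
print(keyword_density("seo tips and seo tricks for better seo results today", "seo"))
```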
Labels:
Absolute Link,
Anchor Text,
Backlink,
Black Hat,
Cloaking,
Conversion Rate,
CPC,
CPM,
CTR,
Delisting,
External Link,
Reciprocal Link,
SEM,
SEO,
SERP,
Sitemap,
SMO,
spider,
Web crawler,
White Hat
What is the role of the robots.txt file in SEO?
The purpose of the robots.txt file is to tell search engine crawlers not to index the folders or files that you don't want to appear in search engines.
You need a robots.txt file only if your site includes content that you don't want search engines to index. If you want search engines to index everything in your site, you don't need a robots.txt file at all.
Robots are often used by search engines to categorize and archive web sites. Also known as "Robot Exclusion Standard" and "Robots Exclusion Protocol".
A robots.txt file restricts access to your site by search engine robots that crawl the web. These bots are automated, and before they access pages of a site, they check to see if a robots.txt file exists that prevents them from accessing certain pages.
For websites with multiple subdomains, each subdomain must have its own robots.txt file. i.e) if you have a robots.txt file for domain.com and no robots.txt file for subdomain.domain.com, the rules that apply to domain.com will not apply to subdomain.domain.com.
The syntax below allows all files on the website:
User-agent: *
Allow: /
This restricts all files:
User-agent: *
Disallow: /
To selectively restrict folders:
User-agent: *
Disallow: /tmp/
Disallow: /personal/
To restrict a specific file:
User-agent: *
Disallow: /personal/mybankdetails.html
Some crawlers support a Sitemap directive, which allows multiple sitemaps to be listed in the same robots.txt file.
Example:
Sitemap: http://www.domain.com/sitemaps/sitemap.xml
Sitemap: http://www.domain.com/news/newsitemaps/newssitemap.xml
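You can check how a crawler would interpret rules like these with Python's standard-library robots.txt parser (the URLs here are the illustrative ones from the examples above):

```python
from urllib.robotparser import RobotFileParser

# The same selective-restriction rules shown above.
rules = """\
User-agent: *
Disallow: /tmp/
Disallow: /personal/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The home page is allowed; anything under /personal/ is not.
print(rp.can_fetch("*", "http://www.domain.com/index.html"))                   # True
print(rp.can_fetch("*", "http://www.domain.com/personal/mybankdetails.html"))  # False
```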
Google SEO Meta keywords and Meta Description
Meta elements are HTML elements used to provide structured metadata about a web page. They must be placed as tags in the head section of the HTML document. Meta elements are commonly used to specify a page's description and keywords. There are four valid attributes: content, http-equiv, name, and scheme. Of these, only content is a required attribute.
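For example, a typical head section with these meta elements might look like the following (the title, description and keywords are made up for illustration):

```html
<head>
  <title>Handmade Widgets | Example Widget Shop</title>
  <meta name="description" content="Buy handmade widgets online. Free shipping on orders over $50.">
  <meta name="keywords" content="widgets, handmade widgets, buy widgets online">
  <meta http-equiv="content-type" content="text/html; charset=utf-8">
</head>
```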
In the early nineties, meta information was used by most search engines to find the right web pages. By late 1997, search engines realized that information stored in meta elements, especially the keywords attribute, was often unreliable and misleading, and mostly used to draw users into spam sites. Search engines began dropping support for metadata provided by the meta element in 1998, and by the early 2000s most search engines had moved away from reliance on meta elements completely.
Unlike the keywords attribute, the description attribute is supported by most major search engines, like Yahoo and Bing, while Google still uses the page content itself to generate the description.
So, from a search engine ranking point of view, these meta tags (especially keywords) are largely useless.
So what other areas can you focus on to make your website content SEO friendly? Here you go:
Your HTML page title: think of the right title to match the content of the page. Make sure you are not repeating keywords many times in the title, or search engines may flag your website as spam :)
Image ALT tags: use alternative text (ALT attributes) for all images. If you have a company logo, make sure it has the right description (different from the HTML title).
H1 tags: when a search engine detects text in H1 tags, it understands that it is important text. Consider using your proposed keywords as part of your H1 tags.
Content: try to use your proposed keywords as part of the content. Content is the top priority for all search engines, so prepare your content in such a way that you can fit your keywords in correctly (and, of course, naturally).
URL: if you can form your webpage URL with your keywords, it will increase your SEO points. Try to avoid IDs and numbers (which most CMSs generate) as part of the URL. If you have a query string with numeric parameters, see if you can convert them to meaningful keywords (using URL rewrite tools).
Sitemap: create a clean sitemap and submit it via Google Webmaster Tools. This will help the web crawler find and index your webpages correctly.
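A minimal XML sitemap for submission via Google Webmaster Tools might look like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com/</loc>
    <lastmod>2010-02-14</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.domain.com/products.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```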
Google Toolbar Page Rank
The Google Toolbar's PageRank feature displays a visited page's PageRank as a whole number between 0 and 10. The most popular websites have the maximum PageRank of 10 (e.g. www.wikipedia.com); the least is 0. Google has not disclosed the precise method for calculating the Toolbar PageRank value.
The Google Toolbar is updated approximately 5 times a year, so often shows out of date values. It was last updated on 13/14 February 2010.
How is the Google PageRank assigned to my website?
This is a common question among website owners. This article will give you an easy answer on the factors behind your website's Google PageRank.
The reason Google, Bing and Yahoo calculate page rank is to assess the popularity of a website objectively and accurately.
If your website has many incoming links (links to your website from other websites), your site will be considered popular; i.e. you will have a higher page rank.
1. Number of Incoming Links:
Every incoming link to your website increases its popularity. In other words, getting your website to the top of Google's search results is like competing in a popularity contest: if you get more votes (links), you will be the winner.
2. Quality/Popularity of the Incoming Link website:
This is a minor extension of point 1. The page rank calculation is not a simple count of incoming links; it depends on the quality (page rank) of the external website that is linking to yours.
i.e. if you have 40-50 links from unknown/unpopular websites (e.g. "www.unknownwebsite.com") and one or two links from a high-quality, high-page-rank website such as Wikipedia (e.g. "www.popularwebsite.com"), the links from "www.popularwebsite.com" will contribute more to your cumulative page rank score than those from "www.unknownwebsite.com".
In summary, your website needs more quality incoming links to achieve a high page rank in Google searches.
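The two points above are exactly what the classic PageRank algorithm captures: each page's score is spread across the pages it links to, so a link from a popular page is worth more than many links from obscure ones. Here is a minimal sketch of the iterative calculation in Python (the site names and the 0.85 damping factor follow the commonly published form of the algorithm; the tiny link graph is invented for illustration):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each linking page q passes a share of its rank to p.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * incoming
        rank = new
    return rank

links = {
    "popularwebsite.com": ["yoursite.com"],
    "unknownwebsite.com": ["yoursite.com", "popularwebsite.com"],
    "yoursite.com": ["popularwebsite.com"],
}
ranks = pagerank(links)
```

In this toy graph, "yoursite.com" ends up with a higher rank than "unknownwebsite.com" because it receives links rather than only giving them.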
How do search engines work?
Web search engines work by storing information about many web pages, which they retrieve from the HTML itself. These pages are retrieved by a Web crawler (also known as a spider), an automated Web browser which follows every link on the site.
The contents of each page are then analyzed to determine how it should be indexed (i.e., words are extracted from the titles, headings, or special fields called meta tags). Data about web pages are stored in an index database for use in later queries. The purpose of an index is to allow information to be found as quickly as possible.
When you are searching in Google/Bing/Yahoo with specific keywords, the engine examines its index and provides a listing of best-matching web pages according to its criteria, usually with a short summary containing the document's title and sometimes parts of the text.
Most search engines support the use of the boolean operators AND, OR and NOT to further specify the search query. Boolean operators are for literal searches that allow the user to refine and extend the terms of the search: the engine looks for the words or phrases exactly as entered. Some search engines provide an advanced feature called proximity search, which allows users to define the distance between keywords. There is also concept-based searching, where the search involves statistical analysis of pages containing the words or phrases you search for. Finally, natural language search allows the user to type a question in the same form one would ask it of a human; an example of such a site is ask.com.
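The index database and boolean operators described above can be sketched in a few lines of Python. This toy example builds an inverted index (word, set of matching document IDs) and answers simple "a AND b" / "a OR b" queries; the documents and function names are invented for illustration and are nothing like a production search engine:

```python
def build_index(docs):
    """Build an inverted index: word -> set of document IDs containing it."""
    index = {}
    for doc_id, text in docs.items():
        for word in set(text.lower().split()):
            index.setdefault(word, set()).add(doc_id)
    return index

def search(index, query):
    """Answer literal one-operator boolean queries: 'a AND b' or 'a OR b'."""
    if " AND " in query:
        left, right = query.split(" AND ")
        return index.get(left.lower(), set()) & index.get(right.lower(), set())
    if " OR " in query:
        left, right = query.split(" OR ")
        return index.get(left.lower(), set()) | index.get(right.lower(), set())
    return index.get(query.lower(), set())

docs = {1: "search engine ranking", 2: "page ranking tips", 3: "search tips"}
idx = build_index(docs)
print(search(idx, "search AND tips"))  # {3}
```

Real engines add ranking, stemming, phrase and proximity matching on top of this basic structure, but the lookup-by-word idea is the same.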
The usefulness of a search engine depends on the relevance of the result set it gives back. While there may be millions of web pages that include a particular word or phrase, some pages may be more relevant, popular, or authoritative than others. Most search engines employ methods to rank the results to provide the "best" results first. How a search engine decides which pages are the best matches, and what order the results should be shown in, varies widely from one engine to another. The methods also change over time as Internet usage changes and new techniques evolve.
How to make your website appear in Google search
Making your website appear in Google search results is not rocket science. Once you know the practical facts and standard practices of the search engines Google, Bing and Yahoo, you will have the right guidelines and expectations set in your mind and can drive your IT team accordingly.
It is highly recommended to design a website that is rich in content (Content is King), pleasant, friendly, simple and usable by humans. This simply satisfies the web crawler's requirements. Natural search engine results are called "organic search" in the search engine world.
The search engine world uses two other terms quite often: SEO, which stands for Search Engine Optimisation, and SEM, which stands for Search Engine Marketing.
SEO means optimizing your website for search engines, which involves preparing the right content, page title, meta tags, meta description, image ALT tags and H1 (content header) tags. Your organic search ranking will improve by working constantly on these SEO techniques.
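As a quick way to check that a page actually carries these on-page elements, the following sketch uses Python's standard html.parser to pull out the title, meta description and H1 text (the class name and the sample page are invented for illustration):

```python
from html.parser import HTMLParser

class MetaChecker(HTMLParser):
    """Collect the title, meta description and first H1 text from a page."""
    def __init__(self):
        super().__init__()
        self.title = self.description = self.h1 = ""
        self._current = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

page = """<html><head><title>SEO Basics</title>
<meta name="description" content="How search engines rank pages"></head>
<body><h1>SEO Basics</h1></body></html>"""
checker = MetaChecker()
checker.feed(page)
print(checker.title, "|", checker.description, "|", checker.h1)
```

Running a check like this over your pages makes it easy to spot missing titles, empty descriptions or pages without an H1 heading.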
SEM (Search Engine Marketing), on the other hand, covers marketing techniques. For paid search results you can use Google AdWords, with CPC (Cost Per Click) and CPI (Cost Per Impression) pricing models.
In addition, link building and other standard practices are part of SEM activities. More details about SEO and SEM will be covered in my next articles.