Search Engine Optimization for quality sites

Optimizing web pages is not a one-time job. It requires continuous study, understanding the requirements of visitors, and delivering a better user experience. Here are some focus areas to implement for better results. These are not basic on-page search engine optimization techniques; they are focused on advanced concepts. The ideas presented here include techniques this site has already implemented.

Avoid boilerplate content

Every page should contain unique content; repetitive words or paragraphs are to be avoided. All pages will share common menus and common messages, so try to keep such common text to a minimum. Apart from necessities like copyright information, the content should be as unique as possible. A good practice is to keep all common text in a separate file and include that file wherever it is required.
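
A minimal sketch in PHP of this practice (the file names page.php and footer.php are placeholders, not from this article): the shared text lives in one file, and every page pulls it in with include instead of repeating it.

<?php
// page.php - a content page; the unique article text goes here.
echo '<h1>Blue Widgets</h1>';
echo '<p>Unique, page-specific content about blue widgets.</p>';

// The shared menu / copyright text is kept once in footer.php
// and included here instead of being repeated on every page.
include 'footer.php';
?>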

Internal linking of pages

How many links can we keep at different levels within our site? Is there any point in linking all the pages from our home page? What is the maximum number of links we can keep on a page? Try to understand where your visitors enter your site. If you run a content-driven site, and not a popular job portal or travel site, then more than two-thirds of your visitors will enter through internal pages. Calculate what percentage of entries go to your home page versus internal pages. In that case, do you really need all those home-page links pointing to each and every corner of the site? A navigational link system should be in place for the different areas of the site.

Related areas

Visitors are more likely to visit related areas than any unrelated or loosely related links. So a page on blue widgets should link to price, availability, how to select, and other related areas rather than to how this site works or to web programming. Sign up for Google Analytics and check the Site Overlay report (Content > Site Overlay). From your site layout you can see which links visitors are clicking. It is observed that links which explain or provide more information around the main subject of a page are likely to get more clicks than other links. In a travel site, most visitors of a city page click a nearby tourist place, yet the same tourist page gets fewer visitors from another city within the same state. So the site needs to develop content focusing on nearby or related areas of the main content. If we are talking about mangoes, it is better to discuss varieties of mangoes and mango seasons than to talk about oranges. This is one reason why the number of pages in a directory focusing on one subject has some influence on page ranking.

More sites or more pages on the same site

For example, if you have a site on programming using ASP, then it is better to add one more section on the same site dedicated to HTML or web design. One big portal is better than several small sites.

Minimum number of pages

It is not the number of pages but the number of quality pages that matters. Rankings often fluctuate, and webmasters complain about loss of rank or of organic hits from search engines. First develop a site of around one hundred pages of quality content, and then think about maintaining rank in search engines. We need to be better than others to get a good rank. Search engines won't send visitors to a site with only a few pages of content.

Duplicate pages

How do you link to your home page from an internal page? Your link can be https://www.sitename.com/index.htm or it can be https://www.sitename.com/. Since in both cases the same page is displayed, this becomes an issue of duplicate URLs: two URLs delivering the same content. There is no penalty associated with this duplication, but it can be avoided by linking to the home page without the index.htm file name. If we maintain a uniform way of home-page linking throughout the site, without index.htm at the end of the URL, we avoid the duplicate page issue. Another point: other sites which link back to us are more likely to use https://www.sitename.com than https://www.sitename.com/index.htm.
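
For example, both internal links below open the same page, but only the first form keeps the URL uniform:

<a href="https://www.sitename.com/">Home</a>
<a href="https://www.sitename.com/index.htm">Home</a>  <!-- creates a duplicate URL -->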

What happens if other sites link to us without the www, as sitename.com (not www.sitename.com)? These links are not in our control, so it is better to handle them at our end than to ask each site owner to change the link. If you are on an Apache server you can do a 301 (permanent) redirect by using an .htaccess file. With the code below, all requests for sitename.com will be redirected to www.sitename.com. Please do a complete test before using an .htaccess file with this code.


Options +FollowSymLinks
RewriteEngine on
# Match requests where the host is the bare domain (dot escaped, case-insensitive)
RewriteCond %{HTTP_HOST} ^sitename\.com$ [NC]
# Permanently (301) redirect to the www host, keeping the requested path
RewriteRule ^(.*)$ https://www.sitename.com/$1 [L,R=301]
Save the file as .htaccess and upload it to your server root in ASCII mode. Then test whether it is working; take the help of your host if you find anything wrong.
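
One way to test is with curl from the command line (a sketch; replace the domain and page with your own). Request the non-www address and check that the server answers with a 301 pointing at the www address:

curl -I http://sitename.com/somepage.htm
# Expected response headers (abridged):
# HTTP/1.1 301 Moved Permanently
# Location: https://www.sitename.com/somepage.htm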

Preferred URL by using rel="canonical"

We can arrive at the same page by using any one of these URLs:
www.example.com
www.example.com/index.htm
example.com
example.com/index.htm
We can't control the way visitors will use the URL, but we can tell search engines that the preferred URL for all the above addresses is www.example.com. For this purpose there is a canonical tag kept inside the head tags. Here is an example.
<link rel="canonical" href="https://www.plus2net.com">

Paging of records

There is another chance of duplicate pages where a paging script is used to break a large number of records into separate pages. The first page is the same whether we access it as page.php?start=0 or as page.php. Such situations should be avoided where possible. Another source is printer-friendly pages. Whenever such duplicate pages come up, engines work out in their own way which is the best or original content page and ignore the duplicate.
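
A minimal sketch in PHP of one way to handle the paging case (the start parameter handling is an assumption, not code from this article): when the first page is requested as page.php?start=0, declare page.php as the canonical URL so both addresses count as one page.

<?php
// page.php - paging script for a long list of records.
$start = isset($_GET['start']) ? (int) $_GET['start'] : 0;

// page.php and page.php?start=0 show the same records,
// so point the canonical tag of both at the plain URL.
if ($start === 0) {
    echo '<link rel="canonical" href="https://www.sitename.com/page.php">';
}
?>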

So it is better to mark a printer-friendly page as lower in priority by adding rel="nofollow" to the link pointing at it.
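
For example, the link to a printer-friendly version could look like this (print.php is a placeholder name):

<a href="print.php?id=12" rel="nofollow">Printer friendly version</a>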
In Google Webmaster Tools we can use Your site on the web > Internal links to find duplicate page issues. Read the Webmaster Tools internal links section for more details.

Google Webmaster Tools

By using Google Webmaster Tools a lot of information can be found out about the site. Keep an eye on the crawl errors reported inside Webmaster Tools. Knowing how the search engine robots see the site gives a lot of information to the webmaster. Google can't give site-specific information to the public, so this is the place where a webmaster can learn about site-specific issues from Google. The actual back links as seen by Google can also be monitored; however, the back links reported here are somewhat old data, and there is a lag between what is shown in the list and what is considered for ranking in the algorithm. From here the webmaster can submit a re-inclusion request, or can report sites using paid links or black hat SEO techniques.

The best part of Webmaster Tools is content analysis. You can fix many errors by checking the list given by Google. Duplicate titles, short meta descriptions, pages not found, and similar reports give you a lot of information and opportunity to improve your site.

Site Map

Keep one site map for visitors and another for the search engine robots. With site-search facilities available from Google and other vendors, the need for a visitor sitemap is reducing. The sitemap for robots has to be written in XML, and there are instructions available on how to develop sitemap pages in XML. This XML sitemap helps robots understand the structure of the site and the importance of each page relative to other pages within the site. This has nothing to do with the PageRank of a page, but it helps robots index all the relevant pages of the site.
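
A minimal XML sitemap sketch (the URLs and values are placeholders; priority expresses importance relative to other pages of the same site, not PageRank):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.sitename.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.sitename.com/blue-widgets.htm</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>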

Keep updating

To maintain a high-quality site we need to keep improving it as requirements change; the site then stays updated, which signals to engines that it is not a dead site.

The frequency of updating depends on the area of focus and the size of the site. Whenever we add a new page we can add a link to the new page from the related internal pages within the site. If your site is a big one, do a site search to find the internal pages which use the keywords of the new page, and provide a link to the new page from those old pages.
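
A rough sketch in PHP of such a site search over local files (the pages/ directory and the keyword are placeholders): it lists the existing pages that mention the new page's keyword, so links can be added there.

<?php
// Find old pages that mention the new page's keyword,
// so a link to the new page can be added from them.
$keyword = 'blue widgets'; // keyword of the new page
foreach (glob('pages/*.htm') as $file) {
    if (stripos(file_get_contents($file), $keyword) !== false) {
        echo $file . " mentions '" . $keyword . "'\n";
    }
}
?>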

Age of the site

It is agreed that the age of a site is a factor in Google's algorithm. The main reason is that Google gives importance to factors which can't be manipulated by webmasters, and age is one thing that can't be changed. Frequency of updating is also important, so just parking a dead site for a long time is not a great idea. However, it is advisable to keep the domain ready and add some minor content, or update the pages at intervals, even if you are not yet ready to fully launch the site. It does not cost much to maintain a website.

After the Panda update, Google published a guideline for building high-quality sites.

If your website is affected by a Google core update, there is a guide from Google.
