March 2013

Saturday, March 30, 2013

Listing all your web pages for users is easy: any online tool can convert your entire archive into an HTML page of links, called an HTML sitemap, which makes it easy for visitors to navigate your site. But if you want all your web pages available to search engines so they can crawl your site, then you need an XML sitemap.

An XML sitemap is a type of sitemap that registers your web pages with search engines. There are different types of sitemap for different content, for example a video sitemap for videos.

There are also programs that generate sitemaps online for your audience, or you can write your own code to generate one.

There are different ways to tell Google to crawl your site through a sitemap. One is the XML sitemap file mentioned above: you create an XML file listing all your web pages and upload it to your web root.

Many online services available on the internet can create this sitemap file for your site; you then just upload that file.


If you have created your sitemap in XML format, you should have a URL like this:
example.com/sitemap.xml
Now you can point Google to it with a sitemap.xml entry in robots.txt and let Google crawl all your content.
Robots.txt is a file that can contain your XML sitemap entry for Google; from then on, Google reads that file.
Ideally, upload this file to your web root, or if you are using a Blogger blog, set it up there and check the change in Webmaster Tools.
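For reference, a minimal sitemap.xml following the sitemaps.org protocol looks like this (the URLs are placeholders for your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
  </url>
  <url>
    <loc>http://example.com/about.html</loc>
  </url>
</urlset>
```

And the robots.txt entry that points search engines at it is a single line:

```text
Sitemap: http://example.com/sitemap.xml
```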

Know-how and difficulties of multiple XML sitemaps:
If you are familiar with Google Webmaster Tools and see only 26 listed URL entries even though you have hundreds of posts, you can fix this issue now.

Basically, sitemaps are used to optimize sites. If you have a blog on Blogger with more than 100 posts or articles, you may log in to the Google Webmaster area and see that only 26 pages are listed and indexed, even though your blog contains more than 100 posts.

How to create and submit an XML sitemap:
If you are managing a blog on Blogger, you can create a sitemap there too. Just log in to Webmaster Tools, click Optimization, click the Add Sitemap button, and enter atom.xml. This is another format Google accepts as a sitemap; it is basically a feed.
So you end up creating a sitemap for Blogger just by adding a parameter called atom.xml.
You will probably face another problem: after submitting this sitemap you will see only 26 pages indexed although you have more than 100, so add the following parameter to get all your pages listed and indexed.


Submitting an XML sitemap for Blogger with more than 26 URL entries:

You are sure you have more than 100 posts but Google has listed only 26 pages, so log in to the Google Webmaster Tools area and create a new sitemap with the following entry.
Tip: create a new XML sitemap inside Webmaster Tools for your Blogger blog and add the following parameter when adding the sitemap.

atom.xml?redirect=false&start-index=1&max-results=100
Final sitemap entry should look like:
yoursite.com/atom.xml?redirect=false&start-index=1&max-results=100

This tells Google that your XML sitemap covers 100 pages and asks it to index them. If you have 200 pages, add another sitemap starting from 101. (Note: max-results is the number of posts per feed page, not an end index, so the second entry should also use max-results=100.)
atom.xml?redirect=false&start-index=101&max-results=100

Final sitemap entry:
yoursite.com/atom.xml?redirect=false&start-index=101&max-results=100
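The pattern above can be sketched in a few lines of Python. This is just an illustration of how the chunked feed URLs are built; the site address and post count are placeholders for your own values.

```python
# Sketch: generate Blogger atom.xml sitemap URLs in chunks of 100 posts.
# start-index is 1-based and max-results is the number of posts per chunk.

def sitemap_entries(site, total_posts, chunk=100):
    """Return one feed URL per block of `chunk` posts."""
    entries = []
    for start in range(1, total_posts + 1, chunk):
        entries.append(
            f"{site}/atom.xml?redirect=false"
            f"&start-index={start}&max-results={chunk}"
        )
    return entries

# A blog with 250 posts needs three sitemap entries:
for url in sitemap_entries("http://yoursite.com", 250):
    print(url)
```

Each printed URL is what you would submit as a separate sitemap in Webmaster Tools.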

So remember these steps:
  • Creating xml site map in Google webmaster
  • Submitting xml sitemap in Google Webmaster
By now you are clear on the types of sitemaps:
  • Xml sitemap 
  • Atom.xml sitemap 
  • RSS 2.0 
RSS 2.0 is another type of sitemap, in the form of a feed, and it is a one-step configuration inside Blogger.

You also have a clear understanding of how to configure your sitemap in Google Webmaster Tools for more than 100 pages and get them all indexed.


Thursday, March 21, 2013

Millions of lines of code will always make your application slow and messy, especially on the web. Whatever technology you used to design and develop your application, if you are looking at a million lines of PHP, you are definitely in a fix.
Social sites and most web-based applications are usually written in PHP, but over the long haul the hurdles start: more code means more load on the web server, and eventually scaling out the web server means more expense.

Your PHP code is becoming more complex, the business logic is taking longer to process even simple functionality, you want to retain some of PHP's fantastic features, and on the other hand your application is getting very slow. You need a solution for that.

PHP is a scripting language and easy to understand because it resembles Java and C++; even the function and variable declaration styles are similar across all three platforms.

What is HipHop for PHP:
HipHop for PHP is a technology that comprises a number of tools, including hphpc (the compiler), hphpi (the interpreter), hphpd (the debugger), and HHVM (the virtual machine). The technology was developed by Facebook to boost the performance of their application. It has now been released as open source, so you can use it too and turn your slow application into a fast one.

HipHop as open-source:
HipHop for PHP is an open-source project that can be used by anyone who wants to boost PHP application performance. In other words, if you have a Facebook-like application and want to speed it up, HipHop for PHP would be the choice.

You keep working on your project and enjoying PHP while HipHop compiles your application into C++ code for scale; this is the basic idea of the project.

Browse the GitHub site and check out the HipHop project; it is only available for Ubuntu right now. You can find the installation procedure and the entire configuration details there.

Friday, March 15, 2013

If you come across a situation where you see duplicate URL entries for your site's pages in Google search results and do not know how to fix them, here is a rundown of how to do it. Basically, these duplicate URLs come from session URLs or archive pages.
In other words, you can call them duplicate-content URLs that exist within your own site: your site URL in date-archive format, sometimes your site URL with session entries, and sometimes the same URL with different parameters showing the same content under different entries in the search engine. You can track them by typing a command on Google:

site:yoursitename.com - press Enter and you will see your site's URL entries with different parameters showing the same pages.

If you manage your blog on Blogspot/Blogger and create posts, you may see a number of archive pages being indexed alongside those new posts; search engines generally do not like these duplicate archive pages.
Blogger gives you an option to remove those archive pages from Google's index. Basically, you need to tell Google not to index them, because these archive pages are duplicate URLs.

Fix the duplicate content URL entries from blogger:
If you are using the old Blogspot/Blogger interface, go to Settings >> Archive and set No Archive. If you are using the new interface, you need to add some code telling Google not to index these archive pages, which is a pretty straightforward approach. You can use this code with the old interface as well; it is just a matter of the snippet mentioned below. Copy the code snippet and paste it into your template.

Sample duplicate archive pages entries:
Some sample duplicate pages are URLs ending with a date and "archive", or with a special character like a question mark.

To fix them, open the HTML section of your Blogger template, find the <head> tag, and add the following code right after it.

<b:if cond='data:blog.pageType == &quot;archive&quot;'>
<meta content='noindex,noarchive' name='robots'/>
</b:if>

The same can happen on your own website, where you see duplicate URLs created by session IDs.
Once you fix this issue, Google will show the exact number of published posts in its index.

Once the Google bot crawls your site again, it will show the exact number of HTML pages or posts, with no extra pages from your own site carrying the same content.
You will see a boost in ranking after this classic SEO fix.

Some other attributes you can use for noindex:
The following sample will prevent the page from being indexed by MSN, so no cached links appear in MSN search results:
<meta name="msnbot" content="noindex">
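A few other well-known values for the robots meta tag are worth keeping at hand. These are standard directives, shown here purely as reference snippets:

```html
<!-- Block all crawlers from indexing the page -->
<meta name="robots" content="noindex">
<!-- Allow indexing but tell crawlers not to follow links on the page -->
<meta name="robots" content="nofollow">
<!-- Target only Google's crawler by name -->
<meta name="googlebot" content="noindex">
```

The name attribute selects which crawler the directive applies to, and "robots" applies to all of them.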

If you claim you know SEO but do not know these three things: Google Panda, Google Penguin, and Matt Cutts, then you are not really familiar with SEO techniques. So we are going to tell you how to fix things and cope with the scenario where you have been affected by these two updates.
Whether you are dealing with a small business or working with large vendors, you need to know the strategies to handle the impact of these updates on your site, so that you do not lose ranking and you understand how to improve your Google ranking.

Here is a rundown of the measures you can take so your business is not affected by Penguin and Panda.
A little background on the Panda and Penguin updates: these are two algorithm updates at Google that can affect your website's ranking in organic search results.

Panda is an algorithmic change that affects sites that do not provide full information or that have thin content. It has already been rolled out by Google throughout the U.S., affecting 12% of queries.

I want to know what the Penguin update is:

The Penguin algorithm, on the other hand, focuses on websites whose content goes against Google's webmaster guidelines, which means sites with keyword stuffing will be affected by this update.
If you are in a link-buying network, Penguin is going to come to your door. If you have used keywords excessively, Penguin is going to come to your home too. Build a natural profile of your website's links; otherwise, once Penguin visits, your site will be out of the universe.
Panda will affect sites that are content farms or that have a high bounce rate, meaning people visit your site, take no interest, and move on somewhere else.
If your site has very little content, Panda is going to teach you a lesson: very simply, you will be pushed back and may not be found in Google search results.

Tips to improve ranking of sites affected by Panda or Penguin:

Do not worry if you are under serious attack by these updates. With a clear mindset, analyze your pages using Google Analytics and check which pages have low page views: either you have written low-value content, or you just need to update those pages with valuable content. That is the only real solution to your problem.

Penguin 4 updates:
And last but not least, Matt Cutts. If you do not know anything about this person, you do not know SEO; we call him the Spider-Man of the search world. So stay connected with him on social media, read his blog updates, and when he announces anything, follow the instructions.
Matt Cutts has announced another algorithm update called Penguin, which they will probably call Penguin 4. When it rolls out, it will affect many websites involved in spamming.


Penguin 4 may be released on the coming Friday or Monday, affecting many sites, especially spammer sites, so users can have a better experience when searching with Google.

Please stay tuned to our site and browse your favorite content and leave your comments if you need any help.

Wednesday, March 6, 2013

It’s very cumbersome for people to remember web addresses while browsing. The solution to this problem was to bookmark the site, so the next time you want to visit the same resource you just click the bookmarked entry. But what if you want the same address at your office or somewhere else outside? You have stored all your web addresses in your home browser, so there was a need for bookmarking that lives online: online bookmarking.
Many organizations then started social bookmarking as well, but a Google product called Google Bookmarks is quite fascinating. Though it does not compete with the huge social bookmarking sites, it gives you a feature to store your bookmarks online so you can access your web addresses from anywhere once you have stored them.
Google Bookmarks is not a new service from Google, but it can ultimately be used in a sophisticated way.

Using Google Toolbar for online bookmarking:

If you are using Google Toolbar, you will find a "Bookmarks" button on the toolbar that stores your bookmarks online. As you know, this toolbar is available for IE and not for the latest Firefox browser, so you can use this functionality in IE and add your favorite sites online using Google Bookmarks. If you have already bookmarked sites on your hard drive, you can import all of them online with the same button: click Google Toolbar >> Bookmarks >> Import IE Favorites, follow the steps from there, and define labels for your online bookmarks. You are done; all your bookmarks are available on the Google Bookmarks site.

Tip: you can move all your bookmarks online just by dragging your offline bookmarks onto your Google Bookmarks online bookmark space.

Online Bookmarking with fun:

If you just want a handy mechanism for storing all your favorite web addresses online, drag the following button (visit the site below) onto your bookmarks menu and drop it there. Then visit any site you want to remember, click that "Google bookmark" button, provide your Google credentials, and you are done: you are storing that web address online.

Bookmark button:
https://support.google.com/chrome/bin/answer.py?hl=en&answer=100215&topic=14680&ctx=topic

Using this technology, you have imported and stored all your favorite sites online rather than on your hard drive.

Why you need Google bookmark online?

If your PC is hit by malware and you are unable to recover the favorite sites you had saved offline on your drive, those bookmarks will be lost. If instead you have saved all your bookmarks online using Google Bookmarks, you can access them from any PC, anywhere.

There are other online bookmarking services available; Delicious and Digg are examples of social bookmarking sites.

Friday, March 1, 2013

Google is not only the search giant but has also been penetrating other fields, as you can see from its products on the black navigation bar when you browse Google.com; from there you can navigate elsewhere and select other services offered by Google.

Google is making a move and testing a new navigation bar, probably a rich matrix-style icon interface for users to view and navigate through its products. The idea might be taken from Chrome-style navigation, where you see matrix-style icons displaying links in the corner. Here you will probably see such an icon for Google products, pointing you to Gmail, video, and other Google products, and it will unfold once you click on it.

Previous experience of the Google navigation bar:
Basically, this navigation mechanism will take less space on screen and show you more icons/links: once you click the matrix-style grey icon, the entire product line will show up and unfold on the same screen.

Previously, some people were annoyed by the way other Google products were presented, but I think Google has decided to present the entire product line in a sophisticated manner this time, and once the test phase is finished, people will probably like it.