How To See Your Website the Way Google Sees It & Get More Traffic
You put in your best efforts, toiled day and night, and invested your savings to build the best website you could.
That makes your site precious! It now deserves regular updates to become more productive. These updates will hit the right target if you follow one tactic:
Just look at your site from Google’s perspective. The popular search engine ranks your site, and that ranking determines its visibility. You have to know how Google works and how it filters the best websites, so that you can make the right modifications to suit Google’s policies.
Your website sits in a sea of millions of sites, but Google’s algorithms are well equipped to find every new website before long. Googlebot regularly scans the web, analyses new sites, and reads the information stored on them. Finally, on the basis of many factors, the newly discovered sites are indexed.
Now, by taking some measures, you can help Googlebot notice your site quickly. How? Let us see…
Generate your own sitemap
A sitemap explains your site architecture to Google, i.e. how the different webpages have been logically organised.
You can get one by installing the WordPress plugin called Google Sitemap Generator. xml-sitemap.com also builds sitemaps for all kinds of sites. After creating the sitemap file, add it to your root directory.
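If you prefer to build the file yourself, a minimal sitemap is easy to generate with Python’s standard library. This is just a sketch: the page URLs below are placeholders, not real addresses from any particular site.

```python
# Minimal sitemap generator: writes a sitemap.xml for a hypothetical
# list of page URLs (the URLs here are placeholders).
import xml.etree.ElementTree as ET

def build_sitemap(urls, path="sitemap.xml"):
    # The xmlns attribute marks the file as following the sitemaps.org protocol.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = u
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/about",
])
```

Upload the resulting `sitemap.xml` to your site’s root directory so it is reachable at `yoursite.com/sitemap.xml`.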
Google Webmaster Tools
This tool is a must for website builders. It acts as a troubleshooter for your site.
When you sign up and submit your site to this tool, it gets quick recognition from Google. Here too you need to add the previously generated sitemap:
Click on your site >> Crawl >> Sitemap >> Add/Test Sitemap.
Request Google to index your site
Though this step is not mandatory, it is worth finding some time to complete it. Go to the Webmaster Tools URL submission page and send the request.
You have gained the attention of Google. Now it will assess the quality of your site. Let us see the detailed procedure.
Google will not index what your robots.txt disallows
A robots.txt file contains directives that tell Google not to index specific content.
Let us understand this in a simple way. Google is scanning your entire site and looking for a robots.txt file. Suppose it finds one. It will look something like this:

User-agent: *
Disallow: /traffic-system/

Google will not index anything under the paths listed after Disallow. But if nothing is written after Disallow, Google is free to index everything.
It is a great feature if you don’t want some webpages to appear in search results. For example, if you have duplicate content that is causing you a penalty, you can add a robots.txt rule to hide it.
To find if your CMS has already created some robots.txt file or not:
Type your site’s URL into the address bar, append /robots.txt, and load the page. If such a file exists, its contents will be displayed.
Or check the dashboard of Google Webmaster Tools; its crawl section lets you check and test your robots.txt file.
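You can also test a robots.txt file programmatically with Python’s standard-library parser. The sketch below uses the rule from the example above (hyphenated here, since URL paths should not contain spaces) and a placeholder example.com domain.

```python
# Check which paths a robots.txt blocks, using Python's standard-library parser.
from urllib.robotparser import RobotFileParser

# The rules below mirror the article's example (a hypothetical blocked directory).
robots_txt = """\
User-agent: *
Disallow: /traffic-system/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot may crawl the homepage, but not the disallowed directory.
print(parser.can_fetch("Googlebot", "https://www.example.com/"))                  # True
print(parser.can_fetch("Googlebot", "https://www.example.com/traffic-system/x"))  # False
```

Running this against your own file (via `parser.set_url(...)` and `parser.read()`) tells you exactly what crawlers are allowed to see.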
Google scans your title tag
The page title of your webpage is the “most important on-page SEO element”. In the HTML code, you will find it enclosed between the opening and closing <title> tags.
What things to keep in mind?
- Google will display roughly the first 65 characters of your page title in the SERP. This tagline must depict your brand in one line, and it should be cleverly written so that people click on it.
- You should have unique headings for each webpage. This shows that you have distinct information in every section of your site.
- The title shouldn’t be stuffed with keywords. It should look natural and reflect your brand value. A purely keyword-oriented title is a big no-no.
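The 65-character rule above is easy to check yourself. This sketch pulls the title out of a page’s HTML with a simple regex; the HTML and the brand name are made-up examples.

```python
# Extract the <title> from a page's HTML and check it against the
# ~65-character display limit. The HTML below is a made-up example.
import re

html = ("<html><head><title>Acme Widgets - Handmade Tools for Every Workshop"
        "</title></head></html>")

match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
title = match.group(1).strip() if match else ""

print(len(title))        # character count of the title
print(len(title) <= 65)  # True if it fits in the snippet Google displays
```

If the check fails, trim the title rather than letting Google truncate it mid-word.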
Next, Google considers your description
You must have seen that below the page titles there is a brief description of the page. It too is metadata, written in the HTML as a <meta name="description"> tag.
But it is not an SEO ranking factor. That means stuffing it with keywords will not improve your site’s ranking.
Still, this description is precious for your website. It is the first advertisement people read before clicking through. Keyword stuffing is a bad idea here too, because commercial words will never touch and move people. A well-crafted description, on the other hand, can boost your CTR.
An ideal description should be around 160 characters. And yes, create unique descriptions for each of your webpages.
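As with titles, the length limit is simple to verify before you publish. The description text below is a made-up example for a hypothetical store.

```python
# Check a meta description's length against the ~160-character limit
# shown in search results. The description is a made-up example.
description = (
    "Acme Widgets sells handmade tools for woodworkers and hobbyists. "
    "Browse our catalogue, read buying guides, and get free shipping "
    "on orders over $50."
)

print(len(description))         # character count
print(len(description) <= 160)  # True if Google can show it in full
```

Anything longer gets cut off with an ellipsis in the SERP, so front-load the message that matters.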
Google will look at the alt tags of your images
Images are a significant part of your site. They also contribute extensively to your site’s ranking.
Though Google can’t directly look at images, it analyses their alt tags. The alt tag is an HTML attribute attached to each image. It reads like this:

<img src="http://www.example.com/picture.png" alt="Keyword Phrase">
You only have to place a short description of each image in place of the keyword phrase. The alt tag must be relevant and use specific keywords. For e-commerce sites, the tag should include model numbers as well.
Automatically generated alt tags are vague and don’t give the right information to the search engine. So go into the HTML code yourself and modify the tags.
And yes, check whether any robots.txt rule is disallowing Google from indexing the images.
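Auditing a page for missing alt tags is a quick job for Python’s standard-library HTML parser. The page snippet below is a made-up example with one good image and one that needs fixing.

```python
# Find <img> tags that are missing an alt attribute, using the
# standard-library HTML parser. The snippet below is a made-up page.
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images without a usable alt

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # absent or empty alt text
                self.missing.append(attrs.get("src", "(no src)"))

html = """
<img src="http://www.example.com/picture.png" alt="Keyword Phrase">
<img src="http://www.example.com/banner.png">
"""

finder = MissingAltFinder()
finder.feed(html)
print(finder.missing)  # images that still need an alt attribute
```

Run this over each page’s HTML and you get a to-do list of images whose alt text needs writing.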
Finally, Google scans your content
Serving powerful and valuable content is the aim of every site. Google’s algorithm does not limit content quantity; rather, it will be impressed if your site is rich in content. Be careful that the content is not duplicated from elsewhere, and make sure it is useful to the audience in some way.
Google also rewards active sites: those that publish new content every week.
Now you will agree that Google has a different viewpoint from yours. You have to meet Google’s expectations to rank higher. So change that old perspective and observe your site from a fresh angle. Follow these tips and make your site more productive!