Content – Heart and Soul of SEO & Your Website



Website Content Planning | What Content Does Google Want?

Content – Heart of SEO

What do we mean by content? Content is basically the heart of your webpage, and it is the most important on-page factor for SEO.

Content is the material, or data, that you put on a webpage for visitors to read; good content makes visitors stay on your website for a longer time.

Remember the golden rule: the webpage is not for the crawler but for the human audience, i.e. the visitors. If you make your content sound organic to human visitors, it will not seem like spam to the search engine crawler either.

Let us discuss the content-related factors that matter for the SEO of a webpage:

• Keyword-Specific Content: Keyword-specific content means that our content should be related to the keywords we have targeted. Suppose our keyword is “herbal products”; then we should make herbal products the theme of the page content and use the keyword wherever appropriate.

Keyword-specific content helps the search engine crawler understand that our page targets that specific keyword, so it will index our webpage better for that keyword.

• Fresh Content: This is what brings visitors back to your website, or in other words increases the percentage of returning visitors. Apart from this, when we add fresh content or update content regularly, the search engine crawler indexes the page much more quickly and improves the index every time.

To add fresh content frequently, we can use articles that are updated regularly on the webpage; they bring visitors back because they find new content on every visit.

• No Duplication: Duplicate content refers to multiple versions of the same content that exist on different pages, either within one domain or across different domains.

Content on the webpage should always be unique. Duplicate content hurts the page's standing in the index, increases the chances of the webpage being treated as spam, and may even get it removed from the search engine's index.

There are duplicate content filters, algorithms designed to compare one page against another. If the filter considers two or more pages to be substantially similar, it simply keeps the more trusted one in the primary index while moving the others to the supplemental index.
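Search engines do not publish these filters, but the underlying idea can be sketched in a few lines of Python. The snippet below is an illustration only; comparing word shingles with Jaccard similarity is a textbook technique, and the shingle size is an arbitrary choice, not anything a search engine has confirmed:

```python
def shingles(text, size=4):
    """All overlapping runs of `size` consecutive words."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(text_a, text_b):
    """Jaccard similarity of the two shingle sets (0.0 to 1.0)."""
    a, b = shingles(text_a), shingles(text_b)
    return len(a & b) / len(a | b) if a and b else 0.0

page_a = "our herbal products are made from natural ingredients sourced locally"
page_b = "our herbal products are made from natural ingredients sourced in india"

print(f"Similarity: {similarity(page_a, page_b):.2f}")  # about 0.67 here
```

A real filter would keep only the most trusted of such pages once this score crosses its internal threshold.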

• No Keyword Stuffing: As we discussed earlier, the content should sound organic to human visitors. When content is written just for the sake of keywords and stuffing is done unnecessarily, it has two negative results: first, the content will not be visitor-friendly, so the visitor will navigate away from the page and might not return; second, keyword-stuffed content is an invitation for the search engine spider to consider the webpage spam.

Remember the following:

Keyword-Stuffed Content – Harmful

Organic-Sounding Content – Good

Keyword-Focused Content – Clever Approach

• Keyword Density: Keyword density is a measure of the percentage of keyword occurrences relative to the total number of words on a webpage. Results on the Search Engine Result Page (SERP) are ranked in part on the percentage of words on a webpage that match the words in the query, i.e. what the visitor types into the search box of the search engine. The average density of a particular keyword on a webpage should be between 3% and 4%. One tool we can use to find the keyword density is “SeoQuake”; we will discuss it in detail in the following chapters.
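As a rough illustration of the arithmetic, here is a minimal Python sketch (our own, not part of any SEO tool; it uses one common convention: words belonging to the keyword, divided by the total word count, times 100):

```python
import re

def keyword_density(text, keyword):
    """One common convention: words belonging to the keyword,
    divided by the total word count, times 100."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    hits = sum(
        words[i:i + len(kw)] == kw
        for i in range(len(words) - len(kw) + 1)
    )
    return 100.0 * hits * len(kw) / len(words) if words else 0.0

page_text = "Herbal products are popular. Our herbal products use pure herbs."
print(f"{keyword_density(page_text, 'herbal products'):.1f}%")  # 40.0% here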

• Code-to-Content Ratio: Now we move into more technical terms related to the placement of content. A ratio, as everyone knows, is a comparison of one element to another. The code-to-content ratio is the amount of code relative to the actual viewable amount of content on the webpage.

When coding a webpage, always try to give the search spider/crawler the shortest possible route to the textual content. We should keep the code-to-content ratio low because, most of the time, search engine crawlers do not crawl the whole page; a crawler generally reads no more than about 100 KB of it.
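You can approximate this ratio yourself. Below is a minimal sketch using only Python's standard library; defining the ratio as visible-text length divided by total page length is our own simplification, one of several conventions in use:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text of a page, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.parts, self._skip = [], 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def content_share(page):
    """Visible text as a fraction of the whole HTML document."""
    parser = TextExtractor()
    parser.feed(page)
    return len("".join(parser.parts).strip()) / len(page)

page = "<html><head><style>body{color:red}</style></head><body><p>Herbal products guide.</p></body></html>"
print(f"Content share: {content_share(page):.0%}")  # about 22% here
```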

In the following topic we will discuss various methods to lower the code-to-content ratio.

• How to Maintain the Code-to-Content Ratio?: Now that we know what the code-to-content ratio is, let us see how to achieve the best one. When we right-click on any web page and select “View page source”, we can see the exact code and content of that page. Our aim is to reduce the code wherever possible. Search engine crawlers do not crawl JavaScript, CSS, animations, etc., and including all of these inline increases the code on the page.

JavaScript code should not sit directly on the webpage; instead, we should create a “.js” file for the script and simply call it from the head or body section of the page. This reduces the code-to-content ratio considerably while still keeping the scripts available to the webpage (a small before/after demonstration follows after the CSS discussion below).

CSS, i.e. Cascading Style Sheets, is very popular for improving the look of a webpage and adding effects. There are three ways to include and use CSS on a webpage: inline CSS, embedded CSS, and external CSS. For reducing the code-to-content ratio, external CSS is the best option: we create a “.css” file and then call that file from the webpage. CSS is a perfect means of reducing the code-to-content ratio, lowering HTML file size, and striking a balance between clean visual design and clean code for the spiders.
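As a small before/after illustration of both techniques (the file names style.css and main.js are made up for this example, and the crude tag-stripping below is our own approximation, not how any crawler measures pages):

```python
import re

def visible_share(page):
    """Drop script/style blocks and all tags, then compare the
    remaining text length with the full page length."""
    stripped = re.sub(r"(?s)<(script|style)[^>]*>.*?</\1>", "", page)
    text = re.sub(r"<[^>]+>", "", stripped)
    return len(text.strip()) / len(page)

# Inline script and style bloat the code the crawler has to read through.
inline_page = """<html><head>
<style>h1 { color: green; font-size: 2em; }</style>
<script>function greet() { alert('hi'); }</script>
</head><body><h1>Herbal Products Guide</h1>
<p>Natural remedies, explained simply.</p></body></html>"""

# The same page calling hypothetical external files: less code, same content.
external_page = """<html><head>
<link rel="stylesheet" href="style.css">
<script src="main.js"></script>
</head><body><h1>Herbal Products Guide</h1>
<p>Natural remedies, explained simply.</p></body></html>"""

print(f"Inline:   {len(inline_page)} chars, {visible_share(inline_page):.0%} content")
print(f"External: {len(external_page)} chars, {visible_share(external_page):.0%} content")
```

The external version is both smaller and has a higher share of visible content, which is exactly what we want the crawler to see.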
