Latent Semantic Indexing, otherwise known as LSI, is an integral part of Google’s search engine algorithm. It determines not only the listing position of your web page in the Google search engine results pages, but also whether it will be listed at all.
Right off, let me say that nobody can produce LSI-compliant content: there is no such thing, because LSI is a concept concerned with analysing text strings in order to determine their true meaning. It is properly referred to as latent semantic analysis (LSA), although Google uses the term latent semantic indexing because the analysis is used to index your web pages.
However, you can use the general sense of the concept in the way you write the content of your web pages, and in your use of contextual relevance to the search terms used by Google users seeking specific information. In explaining how this is done, therefore, I do so with apologies to the purists of LSA, who correctly point out that a web page cannot be ‘LSI-compliant’ and that LSI is not something you apply to improve your on-page SEO.
Let me explain the meaning of contextual relevance with an example. It is a simple example, and perhaps contrived, but it explains the concept and how you can apply it to improve the listing position of any web page. We shall consider a Google user seeking information on ‘how to write’.
How to write what? Check any keyword analysis tool you wish to use, and you will find that the keyword ‘how to write’ is a very popular one, and it is very easy to use it as the main keyword in a web page on writing. In fact, a keyword is no more than your way of informing search engines of the topic of your web page, and it is hopefully the same as the search term being used by the search engine user.
However, the question remains: ‘how to write’ what? The visitor to your page might be looking for help on how to write content for their web pages, magazine articles, articles for newspapers or even how to write novels, and every one of these would need a different approach. The concept of latent semantic indexing involves examining the text strings close to the keyword and deciding the true topic of the page from them. Thus ‘newspaper’, ‘novel’, ‘web page’ and ‘submission’, for example, can all be used in latent semantic indexing to index your page in the correct category, so that the pages presented in the SERPs give search engine users results relevant to their inquiry.
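To make the idea concrete, here is a minimal sketch, in Python using scikit-learn, of how latent semantic analysis separates the different senses of a keyword by looking at the terms that surround it. The example documents, the query and the choice of TruncatedSVD are my own illustrative assumptions, not Google’s actual implementation.

```python
# A minimal sketch of latent semantic analysis, not Google's algorithm.
# The documents and query below are made up for illustration.
# Requires scikit-learn: pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "how to write a novel: plot, characters and chapters for fiction",
    "how to write web page content that ranks in search engines",
    "how to write newspaper articles and handle editorial submission",
]
query = "how to write content for my web pages"

# Build a term-document matrix, then reduce it to a handful of latent
# topics with a truncated SVD - the core step of latent semantic analysis.
vectorizer = TfidfVectorizer(stop_words="english")
term_doc = vectorizer.fit_transform(documents + [query])
latent = TruncatedSVD(n_components=2, random_state=0).fit_transform(term_doc)

# Compare the query with each document in the latent topic space. Surrounding
# terms such as 'content' and 'web' do the disambiguating work here, so the
# web-page document should score higher than the novel or newspaper ones.
scores = cosine_similarity([latent[-1]], latent[:-1])[0]
for doc, score in zip(documents, scores):
    print(f"{score:.2f}  {doc}")
```

The point is not the particular library but the principle: it is the surrounding vocabulary, not the repetition of the keyword itself, that places the page in the right category.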
The real customer of a search engine is the person using it to find information – to carry out a search. You are not Google’s customer, and neither are all those advertisers using Adwords, because without the person carrying out the search Google would not exist. Google ensures that its users get the best possible service, through good search results, by means of latent semantic indexing.
My website offers other examples of contextual relevance, such as how the keyword ‘the history of locks’ can mean one of three different things that keyword stuffing alone could not distinguish. The same is true of the term ‘German Shepherd’, another commonly used example. Are you seeking information on dogs, or on how Germans look after their sheep? Is an Alsatian a dog or a person?
Google had initially used the concept of semantic analysis (‘semantic’ refers to the meaning of words) in its Adsense program, where it determined the type of adverts to place on users’ web pages according to the topic of the page. However, people began making thousands from Adsense by automatically generating pages of meaningless text into which any keyword could be inserted and still read grammatically, thus:
“Information on KW can be found all over the internet, KW being the subject of many online searches. There is a large number of websites providing information on KW, and an equally large number of people using KW as their keyword in their Google search.”
That’s just a brief example, but you could use any keyword you can think of as ‘KW’, and entire web pages would be generated by software designed simply to substitute a keyword of your choice for KW. Many of these sites received top listings because the algorithms were predominantly keyword orientated, and endless repetition of a keyword would almost guarantee a high listing. I did it myself: I could generate 5,000 pages for Adsense from a list of 5,000 keywords, just from one template into which every keyword could be inserted and still make grammatical sense. You don’t need many clicks per page for the Adsense income from that many pages to add up.
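For illustration only, here is a minimal sketch of the kind of template substitution I am describing. The template text, the keyword list and the file naming are hypothetical; the point is simply that one template and a keyword list were enough to churn out thousands of ‘unique’ pages.

```python
# A hypothetical reconstruction of the keyword-template trick described above.
# Both the keywords and the template are invented for illustration.
keywords = ["how to write", "dog training", "home insurance"]

template = (
    "Information on {kw} can be found all over the internet, {kw} being "
    "the subject of many online searches. There is a large number of "
    "websites providing information on {kw}, and an equally large number "
    "of people using {kw} as their keyword in their Google search."
)

# One pass over the keyword list produces one 'page' per keyword.
for kw in keywords:
    filename = kw.replace(" ", "-") + ".html"
    page = template.format(kw=kw)
    print(filename)
    print(page, end="\n\n")
```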
Google stopped it all with LSI. It applied the concept of latent semantic indexing used in its Adsense program to its search engine algorithm, and overnight websites with no text contextually related to the keyword were dropped. People’s income was decimated and their businesses destroyed – and probably rightly so, although they hadn’t set the rules that applied; they had merely taken advantage of them.
Thus, to be listed for ‘article writing’, you would have to make it clear what type of articles and what type of writing you were referring to. With the German Shepherd example, you would have to mention sheep, or dogs, or use some other means of making the topic of your content clear. That’s LSI in action, and while most people are still unsure what latent semantic indexing really means, and use the term wrongly, if you write your content naturally you should be just fine.