Latent Semantic Analysis is not purely about words. LSA can also be applied to other objects, such as images, events, and optically recognized text, and it has been used as a problem-solving technique (Latent Problem Solving Analysis, LPSA).
Latent Semantic Analysis itself is not co-occurrence. Typically, more than 95% of word pairs with high similarity never appear together in the same paragraph. Even when two words do co-occur in the same document, their cosine similarity is not necessarily high; and even if two words never co-occur, their cosine can still be high.
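To make that last point concrete, here is a minimal sketch in Python with NumPy. The toy corpus, the word choices, and the decision to keep two latent dimensions are my own illustrative assumptions, not taken from the text: "car" and "automobile" never appear in the same document, yet after truncated SVD their latent vectors are highly similar.

```python
import numpy as np

# Toy term-document count matrix (rows = terms, columns = documents):
#   d0: "car car engine", d1: "automobile automobile engine",
#   d2: "car engine",     d3: "banana banana banana"
# Note that "car" and "automobile" never share a document.
terms = ["car", "automobile", "engine", "banana"]
X = np.array([
    [2, 0, 1, 0],   # car
    [0, 2, 0, 0],   # automobile
    [1, 1, 1, 0],   # engine
    [0, 0, 0, 3],   # banana
], dtype=float)

# Truncated SVD: keep k latent dimensions to obtain LSA term vectors.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
term_vectors = U[:, :k] * s[:k]     # one row per term in the latent space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "car" and "automobile" never co-occur, yet in this tiny corpus they land on
# the same latent direction, so their cosine is high; the unrelated word
# "banana" stays near zero.
print(cosine(term_vectors[0], term_vectors[1]))   # high (close to 1.0)
print(cosine(term_vectors[0], term_vectors[3]))   # close to 0.0
```

The effect depends on the truncation: keeping all dimensions would largely restore the original near-orthogonality of the two words, which is why the dimensionality reduction is the essential step in LSA.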
Latent Semantic Analysis neglects word order. This is another problem. The solution? The "Syntagmatic Paradigmatic Model", a memory-based mechanism that incorporates word order while preserving the distributional approach. It addresses both word order and polysemy: it is well recognized that LSA does not necessarily account for terms that carry different meanings in different contexts, or for word sequences. The various senses and meanings of a word are not predetermined in any kind of mental lexicon; because of the generative lexicon, they emerge in context. The small illustration below shows the order problem directly.
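A quick illustration, using made-up sentences of my own: the bag-of-words counts that feed the LSA term-document matrix cannot distinguish sentences that differ only in word order.

```python
from collections import Counter

s1 = "man bites dog"
s2 = "dog bites man"

# Both sentences yield the identical count vector, so LSA sees them as the same.
print(Counter(s1.split()) == Counter(s2.split()))   # True
```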
Vectors in Latent Semantic Analysis are context-free, but meaning is not; it is context-dependent. This is a problem. The answer? By combining Latent Semantic Analysis with the Construction-Integration (CI) model of comprehension, word meanings can be made context sensitive.
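The following is only a loose, simplified sketch of the general idea of contextualizing a word's meaning, namely choosing among candidate sense vectors by comparing them to the centroid of the surrounding context's vectors. It is not an implementation of the Construction-Integration model, and every vector and sense label below is an invented placeholder rather than real LSA output.

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 3-dimensional "LSA" vectors for context words and for two
# candidate senses of the ambiguous word "bank" (all values made up).
context_vectors = {
    "money":   np.array([0.9, 0.1, 0.0]),
    "deposit": np.array([0.8, 0.2, 0.1]),
}
senses_of_bank = {
    "bank_financial": np.array([0.85, 0.15, 0.05]),
    "bank_river":     np.array([0.05, 0.10, 0.95]),
}

# Represent the context as the centroid of its word vectors, then pick the
# sense whose vector is closest to that centroid.
context_centroid = np.mean(list(context_vectors.values()), axis=0)
best_sense = max(senses_of_bank,
                 key=lambda s: cosine(senses_of_bank[s], context_centroid))
print(best_sense)   # "bank_financial" for this money-related context
```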