subject: High-Order Co-occurrence & Search Engine Optimization (SEO)

Latent Semantic Analysis is not purely about words. LSA can also be applied to other kinds of objects, such as images, events, optically recognized text, and even problem solving (Latent Problem Solving Analysis, LPSA).
Latent Semantic Analysis itself is not co-occurrence. Typically, more than 95% of word pairs with high similarity never appear together in the same paragraph. Words that do co-occur in the same document do not necessarily have a high cosine similarity, and conversely, words that never co-occur can still have a high cosine.
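The sketch below illustrates this second-order effect on a toy corpus: the words "car" and "automobile" never appear in the same document, yet after reducing the term-document matrix with truncated SVD (a standard way of computing LSA), their cosine similarity is high. The corpus, the word pair, and the number of dimensions are illustrative assumptions, not data from this article.

```python
# Minimal sketch of second-order co-occurrence with LSA (truncated SVD).
# The toy corpus and the word pair "car" / "automobile" are illustrative only.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import TruncatedSVD

corpus = [
    "the car needs a new engine",
    "the automobile needs a new engine",
    "the car was parked in the garage",
    "the automobile was parked in the garage",
]

# Term-document matrix: rows are documents, columns are terms.
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(corpus)
idx = vectorizer.vocabulary_  # term -> column index

# Project terms into a small latent space.
svd = TruncatedSVD(n_components=2, random_state=0)
svd.fit(X)
word_vectors = svd.components_.T  # shape: (n_terms, n_components)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "car" and "automobile" never share a document, yet they share contexts,
# so their cosine in the latent space is high (close to 1).
print(cosine(word_vectors[idx["car"]], word_vectors[idx["automobile"]]))
```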
Another problem is that Latent Semantic Analysis ignores word order. One proposed solution is the Syntagmatic Paradigmatic Model, a memory-based mechanism that incorporates word order while preserving the distributional approach. It addresses both word order and polysemy: it is well known that LSA does not necessarily account for terms that carry different meanings in different contexts or keyword sequences. The different senses and meanings of a word are not predetermined in some mental lexicon; as in the generative lexicon, they emerge from context.
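A minimal sketch of why order is lost: the bag-of-words term-document matrix that LSA starts from is identical for two sentences containing the same words in a different order (the sentences here are invented for illustration).

```python
# Minimal sketch: the bag-of-words matrix that LSA starts from discards word order.
from sklearn.feature_extraction.text import CountVectorizer

docs = ["the dog bit the man", "the man bit the dog"]
X = CountVectorizer().fit_transform(docs).toarray()
print((X[0] == X[1]).all())  # True: identical counts, so LSA treats them as identical
```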
A related problem is that vectors in Latent Semantic Analysis are context-free, whereas meaning is context-dependent. One proposed solution is to combine Latent Semantic Analysis with the Construction-Integration (CI) Model of comprehension, which makes word meanings context sensitive.
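The CI model itself builds and settles a network of propositions; the sketch below is only a much-simplified, centroid-based stand-in for the general idea of shifting a word's static vector toward its current context. It reuses the hypothetical `word_vectors` and `idx` variables from the first sketch, and the blending weight is an arbitrary assumption.

```python
# Simplified, centroid-based contextualization of an LSA word vector.
# NOTE: this is NOT the Construction-Integration algorithm, only an
# illustration of making a static vector context-sensitive.
# Assumes `word_vectors` and `idx` from the earlier LSA sketch.
import numpy as np

def contextualized(word, context_words, weight=0.5):
    """Blend a word's static LSA vector with the centroid of its context words."""
    ctx_vecs = [word_vectors[idx[w]] for w in context_words if w in idx]
    ctx = np.mean(ctx_vecs, axis=0)
    return (1 - weight) * word_vectors[idx[word]] + weight * ctx

# The same word receives different vectors in different contexts.
v_repair = contextualized("engine", ["car", "needs", "new"])
v_storage = contextualized("engine", ["garage", "parked"])
```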
by: suraj