
What are the best algorithms for word-sense disambiguation?

I have read a lot of posts, and each one cites a research paper proving that a specific algorithm is the best; this is very confusing.

I have come away with two realizations: (1) the Lesk algorithm is deprecated, and (2) Adapted Lesk is good but not the best.

If anybody knows, based on experience, any other good algorithm that gives accuracy of, say, 70% or more, please mention it. If there's a link to pseudocode for the algorithm, that would be great; I'll try to implement it in Python or Java.

  • possible duplicate of [How to measure similarity between sentences](http://stackoverflow.com/questions/9354592/how-to-measure-similarity-between-sentences) – wim Feb 20 '12 at 02:58
  • [Anyone know of some good word sense disambiguation software?](http://stackoverflow.com/questions/4613773/anyone-know-of-some-good-word-sense-disambiguation-software/8808962#8808962) – alvas Apr 05 '13 at 07:11

2 Answers


Well, WSD is an open problem (since it's language... and AI...), so currently each of those claims is in some sense valid. If you are working on a domain-specific project, I think you'd be best served by a statistical method such as Support Vector Machines, provided you can find a suitable sense-annotated corpus. Personally, if you're using Python and you aren't attempting significant original research, I think you should just use the NLTK module to accomplish whatever you're trying to do; rough sketches of both routes follow below.
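For the NLTK route, the library ships a simplified Lesk in `nltk.wsd`. A minimal sketch, assuming NLTK is installed and its `wordnet` and `punkt` data packages have been downloaded:

```python
# Minimal sketch: NLTK's built-in simplified Lesk (nltk.wsd.lesk).
# Assumes: pip install nltk, then nltk.download('wordnet') and
# nltk.download('punkt') for the tokenizer data.
from nltk import word_tokenize
from nltk.wsd import lesk

sentence = "I went to the bank to deposit my money"
# lesk(context_tokens, ambiguous_word, pos=None) returns the WordNet
# Synset whose gloss overlaps most with the context tokens.
sense = lesk(word_tokenize(sentence), "bank", pos="n")
print(sense, "->", sense.definition())
```

And a rough sketch of the supervised SVM route, treating WSD as sense classification over context windows. scikit-learn and the toy labelled data here are my assumptions, not part of the answer; in practice you would train on a sense-annotated corpus:

```python
# Hypothetical sketch: supervised WSD as text classification with an SVM.
# scikit-learn and the toy examples below are assumptions for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy labelled contexts for the ambiguous word "bank"
# (WordNet-style labels: bank.n.02 = financial institution,
#  bank.n.01 = sloping land beside a body of water).
contexts = [
    "he deposited cash at the bank downtown",
    "the bank approved my loan application",
    "we had a picnic on the grassy river bank",
    "fish swam near the muddy bank of the stream",
]
senses = ["bank.n.02", "bank.n.02", "bank.n.01", "bank.n.01"]

# TF-IDF features over the context window, linear SVM on top.
clf = make_pipeline(TfidfVectorizer(), LinearSVC())
clf.fit(contexts, senses)
print(clf.predict(["she opened an account at the local bank"]))
```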

jcc333

This question is too vague: there is no best algorithm in general; it will depend on your problem, your data, etc.

What I can suggest is to read some books on Natural Language Processing (NLP).

jtremblay
  • I did, and I'm having issues with the Lesk algorithm; please let me know if you can help with this question: [How to compare 2 sentences with different sizes of words using Lesk or any similar algorithm](http://stackoverflow.com/questions/9367368/how-to-compare-2-sentences-with-different-sizes-of-words-using-lesk-or-any-simil) – Feb 20 '12 at 19:44