Abstract
The present paper concentrates on the issue of feature selection for unsupervised word sense disambiguation (WSD) performed with an underlying Naïve Bayes model. It introduces dependency-based feature selection which, to our knowledge, is used for the first time in conjunction with the Naïve Bayes model acting as a clustering technique. The construction of the dependency-based semantic space required for the proposed task is discussed. The resulting disambiguation method, an extension of the method introduced in [15], lies at the border between unsupervised and knowledge-based techniques. Syntactic knowledge provided by dependency relations (exemplified here for adjectives) is compared to semantic knowledge offered by the semantic network WordNet (examined in [15]). Our conclusion is that the Naïve Bayes model reacts well in the presence of syntactic knowledge of this type and that dependency-based feature selection is a reliable alternative to WordNet-based semantic feature selection.
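The core technique the abstract names, a Naïve Bayes model acting as a clustering method over dependency-based features, can be illustrated with a small sketch. The function name, the EM training loop, and the `relation:lemma` feature encoding below are all illustrative assumptions, not the paper's actual implementation: each occurrence of an ambiguous word is represented by a bag of dependency-based features, and a Naïve Bayes mixture trained with EM groups occurrences into senses without labeled data.

```python
import math
import random
from collections import Counter

def nb_em_cluster(contexts, n_senses=2, n_iter=30, alpha=0.1, seed=0):
    """Cluster word occurrences into senses with a Naive Bayes mixture
    trained by EM (illustrative sketch, not the paper's exact model).

    Each context is a list of feature strings; here we assume
    dependency-based features encoded as "relation:lemma".
    """
    rng = random.Random(seed)
    vocab = sorted({f for ctx in contexts for f in ctx})

    # Random soft initialization of sense responsibilities per occurrence.
    resp = []
    for _ in contexts:
        row = [rng.random() for _ in range(n_senses)]
        s = sum(row)
        resp.append([r / s for r in row])

    for _ in range(n_iter):
        # M-step: re-estimate sense priors and per-sense feature
        # distributions from the current soft assignments.
        priors = [sum(resp[i][k] for i in range(len(contexts))) / len(contexts)
                  for k in range(n_senses)]
        feat_counts = [Counter() for _ in range(n_senses)]
        totals = [0.0] * n_senses
        for i, ctx in enumerate(contexts):
            for k in range(n_senses):
                for f in ctx:
                    feat_counts[k][f] += resp[i][k]
                    totals[k] += resp[i][k]

        # E-step: recompute responsibilities from smoothed log-probabilities
        # under the Naive Bayes independence assumption.
        new_resp = []
        for ctx in contexts:
            logp = []
            for k in range(n_senses):
                lp = math.log(priors[k] + 1e-12)
                for f in ctx:
                    p = (feat_counts[k][f] + alpha) / (totals[k] + alpha * len(vocab))
                    lp += math.log(p)
                logp.append(lp)
            m = max(logp)
            w = [math.exp(l - m) for l in logp]
            s = sum(w)
            new_resp.append([x / s for x in w])
        resp = new_resp

    # Hard sense label per occurrence: the most probable mixture component.
    return [max(range(n_senses), key=lambda k: row[k]) for row in resp]
```

On separable data, e.g. occurrences of "bank" whose dependency features are either `["amod:financial", "nmod:money"]` or `["amod:river", "nmod:water"]`, the EM loop drives the two mixture components toward the two feature distributions, so the returned labels partition the occurrences into two senses.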
