For my machine learning study, I tested several algorithms: SVM, SMO, Naive Bayes, decision trees, etc. All of them gave low accuracy; the best result I obtained was 46%, with Naive Bayes.
Then I tried feature selection, using InfoGainAttributeEval in WEKA. It ranked 7 of my 27 features as informative, so I re-ran the classification with only those 7 features. But the results got worse: the accuracy of every algorithm except SMO decreased, and Naive Bayes dropped to 36%.
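For reference, here is a minimal sketch of the same workflow outside WEKA, using scikit-learn's mutual-information score (the same quantity InfoGainAttributeEval estimates) on a stand-in dataset. The dataset, the value of k, and the Gaussian Naive Bayes variant are all placeholder assumptions, not my original 27-feature data. One detail worth noting: the selection step is placed inside a pipeline so each cross-validation fold ranks features on its own training split only.

```python
# Hypothetical sketch (not the original experiment): compare Naive Bayes
# accuracy on all features vs. the top-k features ranked by mutual
# information, with selection done inside cross-validation.
from sklearn.datasets import load_wine          # placeholder dataset
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = load_wine(return_X_y=True)

# Baseline: Naive Bayes trained on all 13 features of this toy dataset.
baseline = cross_val_score(GaussianNB(), X, y, cv=5).mean()

# Selection inside the pipeline: each CV fold picks its own top-k
# features from its training portion, avoiding selection bias.
pipe = make_pipeline(SelectKBest(mutual_info_classif, k=5), GaussianNB())
selected = cross_val_score(pipe, X, y, cv=5).mean()

print(f"all features: {baseline:.3f}, top-5 features: {selected:.3f}")
```

Running selection on the full dataset before splitting (as a single WEKA attribute-selection pass does) gives optimistic rankings; the pipeline form above is the fairer comparison.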
From what I have read and been taught, feature selection is supposed to reduce complexity and improve accuracy. So why, in my case, did it decrease the accuracy instead?