We consider the well-studied Pattern Recognition (PR) problem of designing linear classifiers. When dealing with normally distributed classes, it is well known that the optimal Bayes classifier is linear only when the covariance matrices are equal; until now, this was the only known condition for the discriminant to be linear. In a previous work, we presented the theoretical framework for optimal pairwise linear classifiers for two-dimensional normally distributed random vectors, and derived the necessary and sufficient conditions that the distributions have to satisfy so as to yield the optimal linear classifier as a pair of straight lines. In this paper we extend that work to d-dimensional normally distributed random vectors and provide the necessary and sufficient conditions under which the optimal Bayes classifier is a pair of hyperplanes. Various scenarios are considered, including one which resolves the multi-dimensional Minsky's paradox for the perceptron. We also provide three-dimensional examples for all the cases and test the classification accuracy of the resulting pairwise linear classifiers. In all the cases, these linear classifiers achieve very good performance.
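As a minimal illustration of the classical equal-covariance case mentioned above (assuming equal priors; this sketch is not the pairwise hyperplane classifier derived in the paper), the following Python snippet writes the Bayes discriminant between two Gaussian classes as a quadratic form and shows that its quadratic term vanishes when the covariance matrices coincide, so the decision boundary is a single hyperplane:

```python
import numpy as np

# Sketch only: Bayes discriminant g(x) = log p(x|w1) - log p(x|w2) for two
# Gaussian classes with equal priors can be written as x'Ax + w'x + b.
# When sigma1 == sigma2, the matrix A vanishes and the boundary is linear.

def bayes_discriminant_terms(mu1, mu2, sigma1, sigma2):
    """Return (A, w, b) such that g(x) = x'Ax + w'x + b."""
    s1_inv = np.linalg.inv(sigma1)
    s2_inv = np.linalg.inv(sigma2)
    A = -0.5 * (s1_inv - s2_inv)              # quadratic term; zero if sigma1 == sigma2
    w = s1_inv @ mu1 - s2_inv @ mu2           # linear (hyperplane) weights
    b = (-0.5 * mu1 @ s1_inv @ mu1
         + 0.5 * mu2 @ s2_inv @ mu2
         - 0.5 * np.log(np.linalg.det(sigma1) / np.linalg.det(sigma2)))
    return A, w, b

if __name__ == "__main__":
    mu1 = np.array([0.0, 0.0, 0.0])
    mu2 = np.array([1.0, 2.0, -1.0])
    sigma = np.array([[2.0, 0.3, 0.0],
                      [0.3, 1.0, 0.2],
                      [0.0, 0.2, 1.5]])
    A, w, b = bayes_discriminant_terms(mu1, mu2, sigma, sigma)
    print("quadratic term vanishes:", np.allclose(A, 0))   # True: boundary is a hyperplane
    print("hyperplane weights w =", w, ", bias b =", b)
```

The paper's contribution concerns the complementary situation: characterizing exactly when, for unequal covariance matrices, the resulting quadratic boundary degenerates into a pair of hyperplanes.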

Additional Metadata
Series: Lecture Notes in Computer Science
Citation: Rueda, L. (Luis), & Oommen, J. (2001). Resolving Minsky's paradox: The d-dimensional normal distribution case. In Lecture Notes in Computer Science.