1Institute for Computer Science and Control (SZTAKI), Eötvös Loránd Research Network (ELKH), Laboratory of Parallel and Distributed Systems, Hungary
2Óbuda University, John von Neumann Faculty of Informatics, Hungary
*Corresponding Author: Ákos Hajnal, Institute for Computer Science and Control (SZTAKI), Eötvös Loránd Research Network (ELKH), Laboratory of Parallel and Distributed Systems, Hungary.
Received: September 23, 2022; Published: November 08, 2022
This paper presents a novel approach to training classifiers implemented as shallow neural networks. The proposed solution is based on the original Perceptron algorithm but extends it to the multi-hyperplane case; consequently, it can solve problems that are not linearly separable. Besides its simplicity, an advantage of the method is its tolerance to imbalanced data, which commonly occurs in practice. The applicability of the method has been demonstrated on several artificial and real-life datasets.
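For context, the single-hyperplane baseline that the proposed method generalizes is the classic mistake-driven Perceptron. The sketch below is a minimal illustrative implementation of that baseline only (the paper's multi-hyperplane extension is not shown); the function names and the choice of NumPy are the author's assumptions, not part of the original paper.

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Classic (single-hyperplane) Perceptron training.

    X: (n_samples, n_features) array; y: labels in {-1, +1}.
    Returns a weight vector w and bias b; converges to a separating
    hyperplane only if the data are linearly separable.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for i in range(n):
            # Mistake-driven rule: update only on a misclassified sample.
            if y[i] * (X[i] @ w + b) <= 0:
                w += y[i] * X[i]
                b += y[i]
                mistakes += 1
        if mistakes == 0:  # all samples classified correctly: stop early
            break
    return w, b

def perceptron_predict(X, w, b):
    """Classify samples by the sign of the signed distance to the hyperplane."""
    return np.sign(X @ w + b)
```

On linearly separable data this converges in finitely many updates (the Perceptron convergence theorem); the multi-hyperplane separation proposed in the paper is what lifts this restriction.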
Keywords: Machine Learning; Artificial Neural Networks; Shallow Neural Network; Classification; Perceptron; Multi-Hyperplane Separation; Mistake-Driven Algorithm
Citation: Ákos Hajnal. “A Multi-hyperplane Separation Method to Train Shallow Classifiers”. Acta Scientific Computer Sciences 4.12 (2022): 02-07.
Copyright: © 2022 Ákos Hajnal. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.