Table of Links
Convex Relaxation Techniques for Hyperbolic SVMs
B. Solution Extraction in Relaxed Formulation
C. On Moment Sum-of-Squares Relaxation Hierarchy
E. Detailed Experimental Results
F. Robust Hyperbolic Support Vector Machine
D. Platt Scaling [31]
Platt scaling [31] is a common way to calibrate binary predictions into probabilistic predictions, which in turn allows binary classifiers to be extended to multiclass classification; it is widely used together with SVMs. The key idea is that, once a separator has been trained, an additional logistic regression is fitted on the prediction scores, which can be interpreted as the closeness to the decision boundary.
In the context of HSVM, suppose w∗ is the linear separator identified by the solver; we then find two scalars, 𝐴, 𝐵 ∈ R, with

P(𝑦 = 1 | 𝑥) = 1 / (1 + exp(𝐴 (w∗ ∗ 𝑥) + 𝐵)),
where ∗ refers to the Minkowski product defined in Equation (1). The values of 𝐴 and 𝐵 are fitted on the training set using logistic regression with some additional empirical smoothing. For one-vs-rest training, we then have 𝐾 pairs of (𝐴, 𝐵) to fit, and at prediction time we assign a sample to the class with the highest probability. See the detailed implementation in LIBSVM at https://home.work.caltech.edu/~htlin/program/libsvm/doc/platt.py.
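The procedure above can be sketched as follows. This is a minimal illustration, not the LIBSVM/HSVM implementation: it fits 𝐴 and 𝐵 by plain gradient descent on the cross-entropy loss (Platt's original method uses a Newton-style solver with smoothed targets), and it assumes the sign convention ⟨u, v⟩ = −u₀v₀ + Σᵢ uᵢvᵢ for the Minkowski product; the function names are hypothetical.

```python
import numpy as np

def minkowski_product(u, v):
    """Minkowski product <u, v> = -u0*v0 + u1*v1 + ...
    (the sign convention here is an assumption, not taken from the paper)."""
    return -u[0] * v[0] + u[1:] @ v[1:]

def fit_platt(scores, labels, lr=0.1, n_iter=5000):
    """Fit A, B in P(y=1 | s) = 1 / (1 + exp(A*s + B)) by gradient
    descent on the cross-entropy loss (a simple stand-in for Platt's
    Newton-based fit with smoothed targets)."""
    A, B = 0.0, 0.0
    y = labels.astype(float)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(A * scores + B))
        # dL/dA = mean((y - p) * s), dL/dB = mean(y - p)
        A -= lr * np.mean((y - p) * scores)
        B -= lr * np.mean(y - p)
    return A, B

def predict_ovr(x, separators, platt_params):
    """One-vs-rest prediction: score x against each of the K separators,
    calibrate each score with its own (A, B), return the argmax class."""
    probs = [
        1.0 / (1.0 + np.exp(A * minkowski_product(w, x) + B))
        for w, (A, B) in zip(separators, platt_params)
    ]
    return int(np.argmax(probs))
```

For linearly separated scores the fitted 𝐴 is negative, so larger (more confidently positive) scores map to probabilities above 1/2, matching Platt's usual parameterization.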
Authors:
(1) Sheng Yang, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA ([email protected]);
(2) Peihan Liu, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA ([email protected]);
(3) Cengiz Pehlevan, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA, Center for Brain Science, Harvard University, Cambridge, MA, and Kempner Institute for the Study of Natural and Artificial Intelligence, Harvard University, Cambridge, MA ([email protected]).
