Method to calculate error of the base learner for the gradient boosting calculation. Note: This method is not used by the gradient boosting algorithm but is useful for debugging purposes.
Model of the weak learner.
Training dataset: RDD of org.apache.spark.mllib.regression.LabeledPoint.
Measure of model error on data.
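A minimal debugging sketch of the contract described above, assuming a point-wise loss function and a model that exposes predict(features: Vector): Double (as MLlib tree ensemble models do); meanError is a hypothetical helper name, not the library method itself:

{{{
import org.apache.spark.mllib.linalg.Vector
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.rdd.RDD

// Hypothetical helper mirroring the computeError contract: average the
// point-wise loss of a prediction function over the training dataset.
def meanError(
    predict: Vector => Double,
    loss: (Double, Double) => Double, // (prediction, label) => loss
    data: RDD[LabeledPoint]): Double = {
  data.map(point => loss(predict(point.features), point.label)).mean()
}
}}}

For debugging, predict could be the prediction function of a trained gradient-boosted trees model and loss the point-wise log loss sketched further below.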
Method to calculate the loss gradients for the gradient boosting calculation for binary classification. The gradient with respect to F(x) is: -4 y / (1 + exp(2 y F(x)))
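A sketch of the stated formula (not necessarily Spark's internal implementation), assuming the label has already been mapped to {-1, 1} and prediction is F(x):

{{{
// Gradient of 2 log(1 + exp(-2 y F(x))) with respect to F(x):
// -4 y / (1 + exp(2 y F(x))).
def gradient(prediction: Double, label: Double): Double =
  -4.0 * label / (1.0 + math.exp(2.0 * label * prediction))
}}}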
:: DeveloperApi :: Class for log loss calculation (for classification). This uses twice the binomial negative log likelihood, called "deviance" in Friedman (1999).
The log loss is defined as: 2 log(1 + exp(-2 y F(x))) where y is a label in {-1, 1} and F(x) is the model prediction for features x.
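A point-wise sketch of this definition, again assuming labels in {-1, 1}; the log1pExp helper is an assumption added here to keep the computation stable for large margins:

{{{
// Numerically stable log(1 + exp(x)): avoids overflow when x is large.
def log1pExp(x: Double): Double =
  if (x > 0.0) x + math.log1p(math.exp(-x)) else math.log1p(math.exp(x))

// Log loss of a single example: 2 log(1 + exp(-2 y F(x))),
// with y in {-1, 1} and F(x) the model prediction for features x.
def logLoss(prediction: Double, label: Double): Double =
  2.0 * log1pExp(-2.0 * label * prediction)
}}}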