:: Experimental :: Fit a parametric survival regression model named accelerated failure time (AFT) model (https://en.wikipedia.org/wiki/Accelerated_failure_time_model) based on the Weibull distribution of the survival time.
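For illustration, a minimal sketch of fitting an AFT survival model (assuming a Spark 2.x-style spark.ml API, an active SparkSession named `spark`, and the default "censor" column for the censoring indicator; the toy data and column names below are illustrative, not from the original docs):

{{{
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.ml.regression.AFTSurvivalRegression

// Toy data: (survival time, censor indicator [1.0 = event observed, 0.0 = censored], features).
val training = spark.createDataFrame(Seq(
  (1.218, 1.0, Vectors.dense(1.560, -0.605)),
  (2.949, 0.0, Vectors.dense(0.346, 2.158)),
  (3.627, 0.0, Vectors.dense(1.380, 0.231)),
  (0.273, 1.0, Vectors.dense(0.520, 1.151)),
  (4.199, 0.0, Vectors.dense(0.795, -0.226))
)).toDF("label", "censor", "features")

val aft = new AFTSurvivalRegression()
  .setQuantileProbabilities(Array(0.3, 0.6))
  .setQuantilesCol("quantiles")

val model = aft.fit(training)
println(s"Coefficients: ${model.coefficients} Intercept: ${model.intercept} Scale: ${model.scale}")
model.transform(training).show(truncate = false)
}}}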
:: Experimental :: Model produced by AFTSurvivalRegression.
:: Experimental :: Decision tree model for regression. It supports both continuous and categorical features.
:: Experimental :: Decision tree learning algorithm for regression. It supports both continuous and categorical features.
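For illustration, a minimal sketch of training a decision tree regressor, with a VectorIndexer identifying which features are categorical (assuming a DataFrame named `data` with "label" and "features" columns; names and parameter values are illustrative):

{{{
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.feature.VectorIndexer
import org.apache.spark.ml.regression.DecisionTreeRegressor

// Index categorical features; features with more than 4 distinct values are treated as continuous.
val featureIndexer = new VectorIndexer()
  .setInputCol("features")
  .setOutputCol("indexedFeatures")
  .setMaxCategories(4)

val dt = new DecisionTreeRegressor()
  .setLabelCol("label")
  .setFeaturesCol("indexedFeatures")

val pipeline = new Pipeline().setStages(Array(featureIndexer, dt))
val model = pipeline.fit(data)
val predictions = model.transform(data)
predictions.select("prediction", "label", "features").show(5)
}}}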
:: Experimental :: Gradient-Boosted Trees (GBTs) model for regression. It supports both continuous and categorical features.
:: Experimental :: Gradient-Boosted Trees (GBTs) learning algorithm for regression. It supports both continuous and categorical features.
The implementation is based upon: J.H. Friedman. "Stochastic Gradient Boosting." 1999.
Notes on Gradient Boosting vs. TreeBoost: this implementation is for Stochastic Gradient Boosting, not for TreeBoost. Both algorithms learn tree ensembles by minimizing loss functions, but TreeBoost (Friedman, 1999) additionally modifies the outputs at tree leaf nodes based on the loss function, whereas the original gradient boosting method does not. When the loss is SquaredError, the two methods give the same result, but they could differ for other loss functions.
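For illustration, a minimal sketch of training a GBT regressor (assuming DataFrames `trainingData` and `testData` with "label" and "features" columns; the parameter values are illustrative, not recommended defaults):

{{{
import org.apache.spark.ml.regression.GBTRegressor

val gbt = new GBTRegressor()
  .setLabelCol("label")
  .setFeaturesCol("features")
  .setMaxIter(20)      // number of boosting iterations, i.e. number of trees
  .setMaxDepth(5)
  .setStepSize(0.1)    // learning rate

val gbtModel = gbt.fit(trainingData)
val predictions = gbtModel.transform(testData)
predictions.select("prediction", "label").show(5)
}}}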
:: Experimental :: Fit a Generalized Linear Model (https://en.wikipedia.org/wiki/Generalized_linear_model) specified by giving a symbolic description of the linear predictor (link function) and a description of the error distribution (family). It supports "gaussian", "binomial", "poisson", and "gamma" as family types. The valid link functions for each family are as follows, with the first listed link being the default: "gaussian": "identity", "log", "inverse"; "binomial": "logit", "probit", "cloglog"; "poisson": "log", "identity", "sqrt"; "gamma": "inverse", "identity", "log".
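For illustration, a minimal sketch of fitting a Poisson GLM with a log link (assuming a DataFrame `dataset` with "label" and "features" columns; the family, link, and parameter values are illustrative):

{{{
import org.apache.spark.ml.regression.GeneralizedLinearRegression

val glr = new GeneralizedLinearRegression()
  .setFamily("poisson")   // one of "gaussian", "binomial", "poisson", "gamma"
  .setLink("log")         // must be valid for the chosen family; defaults to the family's first link
  .setMaxIter(10)
  .setRegParam(0.3)

val glrModel = glr.fit(dataset)
println(s"Coefficients: ${glrModel.coefficients}  Intercept: ${glrModel.intercept}")
}}}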
:: Experimental :: Model produced by GeneralizedLinearRegression.
:: Experimental :: Summary of GeneralizedLinearRegression model and predictions.
:: Experimental :: Summary of GeneralizedLinearRegression fitting and model.
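Continuing the sketch above, the training summary can be inspected from the fitted model (property names assume the spark.ml GeneralizedLinearRegressionTrainingSummary API):

{{{
val summary = glrModel.summary
println(s"Coefficient standard errors: ${summary.coefficientStandardErrors.mkString(", ")}")
println(s"T-values: ${summary.tValues.mkString(", ")}")
println(s"P-values: ${summary.pValues.mkString(", ")}")
println(s"Dispersion: ${summary.dispersion}")
println(s"Null deviance: ${summary.nullDeviance}  Residual deviance: ${summary.deviance}")
println(s"AIC: ${summary.aic}")
summary.residuals().show()
}}}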
:: Experimental :: Isotonic regression.
Currently implemented using the parallelized pool adjacent violators algorithm. Only the univariate (single feature) algorithm is supported.
:: Experimental :: Model fitted by IsotonicRegression. Predicts using a piecewise linear function.
For detailed rules see org.apache.spark.mllib.regression.IsotonicRegressionModel.predict().
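For illustration, a minimal sketch of fitting an isotonic regression model (assuming a DataFrame `dataset` with "label" and "features" columns; names are illustrative):

{{{
import org.apache.spark.ml.regression.IsotonicRegression

val ir = new IsotonicRegression()
  .setLabelCol("label")
  .setFeaturesCol("features")
  .setIsotonic(true)   // set to false for an antitonic (monotonically decreasing) fit

val irModel = ir.fit(dataset)
println(s"Boundaries in increasing order: ${irModel.boundaries}")
println(s"Predictions associated with the boundaries: ${irModel.predictions}")
irModel.transform(dataset).show()
}}}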
:: Experimental :: Linear regression.
The learning objective is to minimize the squared error, with regularization. The specific squared error loss function used is: L = 1/(2n) ||A coefficients - y||^2. This supports multiple types of regularization: none (a.k.a. ordinary least squares), L2 (ridge regression), L1 (Lasso), and L2 + L1 (elastic net).
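For illustration, a minimal sketch of fitting an elastic-net-regularized linear regression (assuming a DataFrame `training` with "label" and "features" columns; parameter values are illustrative):

{{{
import org.apache.spark.ml.regression.LinearRegression

val lr = new LinearRegression()
  .setMaxIter(100)
  .setRegParam(0.3)           // overall regularization strength
  .setElasticNetParam(0.8)    // 0.0 = pure L2 (ridge), 1.0 = pure L1 (lasso), in between = elastic net

val lrModel = lr.fit(training)
println(s"Coefficients: ${lrModel.coefficients}  Intercept: ${lrModel.intercept}")
}}}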
:: Experimental :: Model produced by LinearRegression.
:: Experimental :: Linear regression results evaluated on a dataset.
:: Experimental :: Linear regression training results. Currently, the training summary ignores the training weights except for the objective trace.
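Continuing the sketch above, the training summary is available on the fitted model (property names assume the spark.ml LinearRegressionTrainingSummary API):

{{{
val trainingSummary = lrModel.summary
println(s"Iterations: ${trainingSummary.totalIterations}")
println(s"Objective history: ${trainingSummary.objectiveHistory.mkString(", ")}")
println(s"RMSE: ${trainingSummary.rootMeanSquaredError}")
println(s"r2: ${trainingSummary.r2}")
trainingSummary.residuals.show()
}}}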
:: Experimental :: Random Forest model for regression. It supports both continuous and categorical features.
:: Experimental :: Random Forest learning algorithm for regression. It supports both continuous and categorical features.
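For illustration, a minimal sketch of training a random forest regressor (assuming a DataFrame `trainingData` with "label" and "features" columns; parameter values are illustrative):

{{{
import org.apache.spark.ml.regression.RandomForestRegressor

val rf = new RandomForestRegressor()
  .setLabelCol("label")
  .setFeaturesCol("features")
  .setNumTrees(50)
  .setFeatureSubsetStrategy("auto")   // how many features to consider per split

val rfModel = rf.fit(trainingData)
println(s"Feature importances: ${rfModel.featureImportances}")
}}}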
:: DeveloperApi :: Model produced by a Regressor. Type parameters: FeaturesType, the type of input features (e.g., org.apache.spark.mllib.linalg.Vector), and M, the concrete Model type.
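For orientation only, a hedged sketch of how these type parameters fit together; the classes below are hypothetical stand-ins, not the actual spark.ml source:

{{{
// FeaturesType: the feature representation; M: the concrete model type.
// The F-bounded parameter M ties each model subclass to its own concrete type,
// so APIs defined on the base class can refer to M rather than the abstract base.
abstract class MyRegressionModel[FeaturesType, M <: MyRegressionModel[FeaturesType, M]] {
  def predict(features: FeaturesType): Double
}

class MyLinearModel(val weights: Array[Double], val intercept: Double)
  extends MyRegressionModel[Array[Double], MyLinearModel] {
  override def predict(features: Array[Double]): Double =
    features.zip(weights).map { case (x, w) => x * w }.sum + intercept
}
}}}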