Families of regression that I have listed out to study:
- Hedonic Regression
- Linear Regression - OLS
  - When the 'Y' dependent variable is continuous on an interval/ratio scale, i.e. a one-unit increase has a consistent meaning
  - When 'Y' (more precisely, the residuals) is approximately normally distributed
  - When multiple independent variables jointly define an outcome
  - Using OLS we get:
    - An estimate for each independent variable, and along with the estimate its Standard Error, t-statistic, and p-value
    - The Standard Error should be as small as possible
    - The t-value should be large, to indicate a relationship between the IV and DV
    - The p-value should be very small, so that we can infer the relationship is not by chance
  - Fit statistics - which one to use depends on the problem we are trying to solve
    - R-squared and Adjusted R-squared
    - RMSE - Root Mean Squared Error
    - AIC - Akaike Information Criterion (the lower the better, and the more parsimonious the model)
    - MAE - Mean Absolute Error
    - MAPE - Mean Absolute Percentage Error
    - Min-Max Accuracy
- 2 Stage Least Squares Regression (2SLS) - This technique is the extension of the OLS method. It is used when the dependent variable's error terms are correlated with the independent variables.
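A hand-rolled NumPy sketch of 2SLS on synthetic data, to show why it helps when a regressor is correlated with the error term (the instrument `z` and all coefficients here are made up for illustration):

```python
# 2SLS: x is endogenous (correlated with the error u); z is an instrument
# correlated with x but not with u. Stage 1 projects x onto z; stage 2
# regresses y on the projected x_hat.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=n)                        # instrument
u = rng.normal(size=n)                        # structural error
x = 0.8 * z + 0.5 * u + rng.normal(size=n)    # endogenous regressor
y = 2.0 + 3.0 * x + u                         # true coefficient on x is 3.0

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

# Naive OLS is biased upward because x is correlated with u
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 1: regress x on z, keep fitted values x_hat
gamma = np.linalg.lstsq(Z, x, rcond=None)[0]
x_hat = Z @ gamma

# Stage 2: regress y on x_hat
X_hat = np.column_stack([np.ones(n), x_hat])
beta_2sls = np.linalg.lstsq(X_hat, y, rcond=None)[0]

print(beta_ols[1], beta_2sls[1])  # 2SLS slope should be close to 3.0
```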
- Spline Regression
- Multi Variate Adaptive Regression Splines (MARS)
- Polynomial Regression
- Generalized Linear Regression and Extensions
- Generalized Additive Models (GAM) - basically generalized linear regression with smoothing: the linear predictor combines non-linear smooth terms with other covariates
- Vector Generalized Linear and Additive Models
- Ordinal Regression
- Survival Analysis - regression for non-negative time-to-event outcomes with right censoring
- Probit Regression
- Quantile Regression
- Poisson Regression
- Stepwise Regression
- Least Absolute Shrinkage and Selection Operator (LASSO) Regression - L1 regularization
- Ridge Regression - L2 regularization
- Elastic Net Regression - has both L1 and L2 regularization
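A small scikit-learn sketch contrasting the three penalties above, on synthetic data where only the first 3 of 10 features matter (all coefficients and alphas are made up for illustration):

```python
# L1 (Lasso) drives irrelevant coefficients to exactly zero;
# L2 (Ridge) only shrinks them; Elastic Net mixes both penalties.
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso, Ridge

rng = np.random.default_rng(2)
n, p = 500, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]           # only 3 truly non-zero coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)

lasso = Lasso(alpha=0.1).fit(X, y)                    # L1
ridge = Ridge(alpha=1.0).fit(X, y)                    # L2
enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)  # L1 + L2

print((lasso.coef_ == 0).sum())  # Lasso zeroes out irrelevant features
print((ridge.coef_ == 0).sum())  # Ridge typically keeps all of them non-zero
```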
- Support Vector Regression
- Decision Tree Regression
- Random Forest Regression
- Logistic Regression
- PLS: Partial least squares or projection to latent structures
- Nonlinear regression
- Flexible Regression and Smoothing
- Bayesian Linear Regression
- Principal Component Regression (PCR)
- Locally Weighted Regression (LWR)
- Least Angle Regression (LARS)
- Neural Net Regression
- Gradient Descent Regression
- Locally Estimated Scatterplot Smoothing Regression - LOESS Regression (similar to K-NN Regression)
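A quick LOESS sketch via statsmodels' `lowess` on made-up noisy sine data; the `frac` argument picks the fraction of nearest points used for each local fit, which is what gives it the K-NN-like flavour:

```python
# LOESS: fit a smooth curve by running weighted local regressions over
# the nearest neighbours of each point.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(3)
x = np.linspace(0, 10, 200)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)

# Each local fit uses the nearest 20% of points (neighbourhood size)
smoothed = lowess(y, x, frac=0.2)   # returns sorted (x, fitted) pairs

residual = np.abs(smoothed[:, 1] - np.sin(smoothed[:, 0]))
print(residual.mean())  # the smoother should track sin(x) closely
```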
- K-NN: K Nearest Neighbor Regression
- Zero Inflated Poisson Regression
- Isotonic Regression
- Nearly-Isotonic Regression
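A minimal isotonic regression sketch with scikit-learn: fit the best non-decreasing step function to noisy data (the log trend and noise level are assumptions for illustration):

```python
# Isotonic regression: least-squares fit constrained to be monotone
# (non-decreasing by default), solved by pool-adjacent-violators.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(4)
x = np.arange(50, dtype=float)
y = np.log1p(x) + rng.normal(scale=0.2, size=x.size)  # rising trend + noise

iso = IsotonicRegression()
y_fit = iso.fit_transform(x, y)

# Fitted values are monotone even though the raw y values are not
print(np.all(np.diff(y_fit) >= 0))
```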
- Censored Regression - Using Tobit Model
- Softmax Regression (multinomial logistic regression)
- Sliced Inverse Regression (SIR) - a tool for dimension reduction in multivariate statistics