Family of Regression Techniques

A family of regression techniques that I have listed out to study

  1. Hedonic Regression
  2. Linear Regression - OLS
    1. When the dependent variable 'Y' is continuous on an interval/ratio scale, i.e. a one-unit increase means the same thing anywhere along its range
    2. When 'Y' is normally distributed
    3. When multiple relationships define an outcome
    4. Using OLS we get:
      1. An estimate for each independent variable, along with its Standard Error, t-statistic and p-value
      2. The Standard Error should be as small as possible
      3. The t-value should be large in magnitude, indicating a relationship between the IV and the DV
      4. The p-value should be very small so that we can infer the relationship is not due to chance
    5. Fit Statistics - which one to use depends on the problem we are trying to solve
      1. R² and Adjusted R²
      2. RMSE
      3. AIC - Akaike Information Criterion (lower is better; favors more parsimonious models)
      4. MAE - Mean Absolute Error
      5. MAPE - Mean Absolute Percentage Error
      6. Min-Max Accuracy
  3. 2-Stage Least Squares Regression (2SLS) - an extension of OLS, used when the error term is correlated with one or more independent variables (endogeneity)
  4. Spline Regression
  5. Multivariate Adaptive Regression Splines (MARS)
  6. Polynomial Regression 
  7. Generalized Linear Regression and Extensions
  8. Generalized Additive Models (GAM) with Local Regression - basically generalized linear regression with smoothing: non-linear smooth terms plus other covariates
  9. Vector Generalized Linear and Additive Models
  10. Ordinal Regression
  11. Survival Analysis - Non-Negative Regression (Right Censoring)
  12. Probit Regression
  13. Quantile Regression
  14. Poisson Regression
  15. Stepwise Regression
  16. Least Absolute Shrinkage and Selection Operator (LASSO) Regression - L1 regularization
  17. Ridge Regression - L2 regularization
  18. Elastic Net Regression - has both L1 and L2 regularization
  19. Support Vector Regression
  20. Decision Tree Regression
  21. Random Forest Regression
  22. Logistic Regression
  23. PLS: Partial Least Squares, or Projection to Latent Structures
  24. Nonlinear Regression
  25. Flexible Regression and Smoothing
  26. Bayesian Linear Regression
  27. Principal Component Regression
  28. Locally Weighted Regression (LWL)
  29. Least Angle Regression (LARS)
  30. Neural Network Regression
  31. Gradient Descent Regression
  32. Locally Estimated Scatterplot Smoothing (LOESS) Regression (similar to K-NN Regression)
  33. K-NN: K-Nearest Neighbors Regression
  34. Zero-Inflated Poisson Regression
  35. Isotonic Regression
  36. Nearly-Isotonic Regression
  37. Censored Regression - using the Tobit Model
  38. SoftMax Regression
  39. Sliced Inverse Regression (SIR) - a dimension-reduction tool from multivariate statistics
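
A minimal numpy-only sketch of the OLS outputs and fit statistics described under Linear Regression - OLS in the list: the estimates, standard errors, t-statistics, R²/Adjusted R², RMSE, MAE and a Gaussian AIC. The data and function names here are illustrative, not from any particular library.

```python
import numpy as np

def ols_fit(X, y):
    """OLS with the summary quantities from the notes above."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])          # add an intercept column
    p = Xd.shape[1]
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)  # coefficient estimates
    resid = y - Xd @ beta
    rss = resid @ resid
    sigma2 = rss / (n - p)                          # unbiased residual variance
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Xd.T @ Xd)))  # standard errors
    t = beta / se                                   # t-statistics
    tss = ((y - y.mean()) ** 2).sum()
    r2 = 1.0 - rss / tss
    adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p)
    rmse = np.sqrt(rss / n)
    mae = np.abs(resid).mean()
    # Gaussian log-likelihood AIC: lower is better / more parsimonious
    ll = -0.5 * n * (np.log(2.0 * np.pi * rss / n) + 1.0)
    aic = 2.0 * p - 2.0 * ll
    return {"coef": beta, "se": se, "t": t, "r2": r2, "adj_r2": adj_r2,
            "rmse": rmse, "mae": mae, "aic": aic}

# Synthetic example: y = 1.5 + 2.0*x1 - 3.0*x2 + small noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 1.5 + 2.0 * X[:, 0] - 3.0 * X[:, 1] + 0.1 * rng.normal(size=200)
fit = ols_fit(X, y)
```

With the small noise above, the slope estimates sit close to 2.0 and -3.0, the t-statistics are large in magnitude, and R² is near 1 - exactly the pattern the checklist above describes for a good fit.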
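
A toy sketch of 2-Stage Least Squares from the list, on synthetic data with illustrative variable names: an unobserved confounder `u` makes the regressor `x` endogenous, so plain OLS is biased, while projecting `x` onto an instrument `z` first (stage 1) and regressing on the fitted values (stage 2) recovers the true slope.

```python
import numpy as np

def ols_coefs(X, y):
    """Plain OLS coefficients with an intercept."""
    Xd = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(Xd, y, rcond=None)[0]

rng = np.random.default_rng(1)
n = 20_000
z = rng.normal(size=n)                 # instrument: moves x, unrelated to the error in y
u = rng.normal(size=n)                 # unobserved confounder
x = z + u + 0.5 * rng.normal(size=n)   # endogenous regressor (correlated with u)
y = 2.0 * x + 3.0 * u + rng.normal(size=n)

b_ols = ols_coefs(x, y)[1]             # biased upward: absorbs the u -> y path

# Stage 1: regress the endogenous x on the instrument z, keep the fitted values
a = ols_coefs(z, x)
x_hat = a[0] + a[1] * z
# Stage 2: regress y on the stage-1 fitted values
b_2sls = ols_coefs(x_hat, y)[1]        # close to the true slope of 2.0
```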
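
To illustrate the L2 regularization behind Ridge Regression from the list (LASSO swaps in an L1 penalty and Elastic Net combines both, but those need an iterative solver), here is a closed-form numpy sketch with illustrative names: as the penalty grows, the coefficients shrink toward zero.

```python
import numpy as np

def ridge_coefs(X, y, lam):
    """Closed-form ridge: solve (X'X + lam*I) b = X'y on centred data,
    so the intercept is not penalized."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    k = X.shape[1]
    return np.linalg.solve(Xc.T @ Xc + lam * np.eye(k), Xc.T @ yc)

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
y = 4.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(size=100)

b_unpenalized = ridge_coefs(X, y, lam=0.0)    # identical to OLS
b_shrunk = ridge_coefs(X, y, lam=1000.0)      # heavily shrunk toward zero
```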
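
K-NN regression, which the list notes LOESS resembles, predicts by averaging the targets of the k nearest training points. A tiny 1-D numpy sketch with illustrative names:

```python
import numpy as np

def knn_predict(x_train, y_train, x_new, k=3):
    """Average the y-values of the k training points nearest to x_new."""
    dist = np.abs(x_train - x_new)      # 1-D distance to every training point
    nearest = np.argsort(dist)[:k]      # indices of the k closest points
    return y_train[nearest].mean()

x_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_train = 2.0 * x_train                 # a simple linear target

pred = knn_predict(x_train, y_train, x_new=2.0, k=3)  # averages y at x = 1, 2, 3
```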