scikit-learn/README.md:

These folders include demo programs that leverage the scikit-learn library to solve common machine learning tasks.
|Algorithm|Description|Link|
|------|------|--------|
|Linear regression|A linear model describing the relation between a scalar dependent variable y and one or more independent variables X.|[Source Code](https://github.com/Cheng-Lin-Li/MachineLearning/blob/master/scikit-learn/LinearRegression/sklearn-LinearRegression.py)|
|Logistic regression|Also known as logit regression. Despite the name, it is a classifier rather than a regression method: a linear probability model that categorizes a random variable Y as 0 or 1 from experimental data, assuming independence of irrelevant alternatives among categories. The model is p(y=1\|x, b, w) = sigmoid(g(x)), where g(x) = b + w^T x and sigmoid(a) = 1/(1 + e^(-a)).|[Source Code](https://github.com/Cheng-Lin-Li/MachineLearning/blob/master/scikit-learn/LogisticRegression/logistic_regression.py)|
|Gaussian Mixture Models (GMMs)|GMMs are among the most statistically mature methods for data clustering (and density estimation). They assume each component generates data from a Gaussian distribution.|[Source Code](https://github.com/Cheng-Lin-Li/MachineLearning/blob/master/scikit-learn/KMean_GMM/k-means_EM-GMM.py)|
|K-Means|One of the most famous and easiest-to-understand clustering algorithms.|[Source Code](https://github.com/Cheng-Lin-Li/MachineLearning/blob/master/scikit-learn/KMean_GMM/k-means_EM-GMM.py)|
|PLA|Perceptron Learning Algorithm, a solver for binary classification tasks.|[Source Code](https://github.com/Cheng-Lin-Li/MachineLearning/blob/master/scikit-learn/PLA/sklearn-Perceptron.py)|
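The logistic regression model above, p(y=1|x, b, w) = sigmoid(b + w^T x), can be sketched with scikit-learn directly. This is a minimal illustration on hypothetical toy data, not the repository's own script: it fits `LogisticRegression` on six one-dimensional points and checks that applying the sigmoid formula to the learned intercept b and weights w reproduces `predict_proba`.

```python
# Minimal sketch (toy data, not the repo's code): fit scikit-learn's
# LogisticRegression and verify p(y=1|x) = sigmoid(b + w^T x) by hand.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical 1-D data: small x -> class 0, large x -> class 1.
X = np.array([[0.5], [1.0], [1.5], [3.0], [3.5], [4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = LogisticRegression().fit(X, y)

# Recover b and w, then compute sigmoid(g(x)) for the first sample.
b, w = clf.intercept_[0], clf.coef_[0]
a = b + w @ X[0]                      # g(x) = b + w^T x
manual = 1.0 / (1.0 + np.exp(-a))     # sigmoid(a)

# Matches the probability scikit-learn reports for class 1.
assert np.isclose(manual, clf.predict_proba(X[:1])[0, 1])
print(clf.predict([[0.8], [3.2]]))    # predicted classes for two new points
```

The same pattern (instantiate an estimator, call `fit`, then `predict`) applies to the other table entries.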
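The table pairs K-Means and GMMs in one demo because both cluster unlabeled data; a GMM additionally models each cluster as a Gaussian component. A short sketch under the same assumption of made-up toy data (two well-separated blobs, not the repository's dataset):

```python
# Minimal sketch (hypothetical toy data): cluster the same points with
# K-Means and a Gaussian mixture and recover the two blob centers.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two Gaussian blobs of 50 points each, around (0, 0) and (5, 5).
X = np.vstack([rng.normal(0.0, 0.5, (50, 2)),
               rng.normal(5.0, 0.5, (50, 2))])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

print(km.cluster_centers_)  # centroids near (0, 0) and (5, 5)
print(gmm.means_)           # GMM component means recover the same centers
```

K-Means assigns each point to exactly one centroid, while `GaussianMixture` (fit via EM) yields soft responsibilities through `predict_proba`.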