Introduction to RapidMiner Studio 7.6, part 9: Linear models
Michał Bereta, www.michalbereta.pl

Linear models

RapidMiner provides several discriminative linear models as operators:
- LDA (Linear Discriminant Analysis)
- QDA (Quadratic Discriminant Analysis)
- RDA (Regularized Discriminant Analysis)
- Classification by Regression (can use any regression model as its subprocess)
- Perceptron
- SVM (without a nonlinear kernel function; remember to set kernel type to dot)
- Logistic regression and the Generalized Linear Model (despite the word "regression" in the name, these are classifiers!)
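As a point of reference outside RapidMiner, the sketch below (an illustration, not the RapidMiner implementation) uses rough scikit-learn counterparts of the operators listed above and evaluates each with 10-fold cross-validation on a small built-in dataset; the dataset is only an example.

```python
# Hedged sketch: scikit-learn counterparts of the linear classifiers listed above,
# compared with 10-fold cross-validation on a small built-in dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression, Perceptron
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

X, y = load_breast_cancer(return_X_y=True)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "Perceptron": Perceptron(max_iter=1000),
    "SVM (linear)": LinearSVC(),                    # linear SVM, no kernel trick
    "Logistic regression": LogisticRegression(max_iter=5000),
}

for name, model in models.items():
    scores = cross_val_score(make_pipeline(StandardScaler(), model), X, y, cv=10)
    print(f"{name:20s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```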
From the RapidMiner documentation:

Linear Discriminant Analysis (RapidMiner Core)
This operator performs linear discriminant analysis (LDA). This method tries to find the linear combination of features which best separates two or more classes of examples. The resulting combination is then used as a linear classifier. Discriminant analysis is used to determine which variables discriminate between two or more naturally occurring groups; it may have a descriptive or a predictive objective.

Quadratic Discriminant Analysis (RapidMiner Core)
This operator performs a quadratic discriminant analysis (QDA). QDA is closely related to linear discriminant analysis (LDA), where it is assumed that the measurements are normally distributed. Unlike LDA, however, in QDA there is no assumption that the covariance of each of the classes is identical. Estimating the parameters required for quadratic discrimination takes more computation and data than for linear discrimination. If there is not a great difference between the group covariance matrices, linear discrimination will perform as well as quadratic discrimination. Quadratic discrimination is the general form of Bayesian discrimination.

Regularized Discriminant Analysis (RapidMiner Core)
Regularized discriminant analysis (RDA) is a generalization of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA); both algorithms are special cases of it. If the alpha parameter is set to 1, this operator performs LDA. Similarly, if the alpha parameter is set to 0, this operator performs QDA.

Classification by Regression (RapidMiner Core)
This operator builds a polynominal classification model through the given regression learner. The Classification by Regression operator is a nested operator, i.e. it has a subprocess. The subprocess must contain a regression learner, i.e. an operator that generates a regression model. This operator builds a classification model using the regression learner provided in its subprocess.
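The role of the RDA alpha parameter described above can be made concrete with a short sketch. The code below is only an illustration of the idea (not RapidMiner's code): each class covariance is blended with the pooled covariance, so alpha = 1 reproduces the LDA assumption (one shared covariance) and alpha = 0 reproduces QDA (one covariance per class).

```python
# Illustration of the RDA idea: blend per-class and pooled covariance matrices.
import numpy as np

def rda_class_covariances(X, y, alpha):
    """Return one regularized covariance matrix per class.

    alpha = 1 -> every class uses the pooled covariance (LDA case);
    alpha = 0 -> every class keeps its own covariance (QDA case).
    """
    classes = np.unique(y)
    per_class = {}
    pooled = np.zeros((X.shape[1], X.shape[1]))
    for c in classes:
        Xc = X[y == c]
        per_class[c] = np.cov(Xc, rowvar=False)
        pooled += (len(Xc) - 1) * per_class[c]
    pooled /= (len(X) - len(classes))          # pooled within-class covariance
    return {c: alpha * pooled + (1 - alpha) * per_class[c] for c in classes}
```

These regularized covariances would then be plugged into the usual Gaussian class-conditional classifier.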
Here is an explanation of how a classification model is built from a regression learner. For each class i of the given ExampleSet, a regression model is trained after setting the label to +1 if the label is i and to -1 if it is not. Then the regression models are combined into a classification model. This model can be applied using the Apply Model operator. In order to determine the prediction for an unlabeled example, all regression models are applied and the class belonging to the regression model which predicts the greatest value is chosen.

Perceptron (RapidMiner Core)
This operator learns a linear classifier called Single Perceptron, which finds a separating hyperplane (if one exists). This operator cannot handle polynominal attributes. The perceptron is a type of artificial neural network invented in 1957 by Frank Rosenblatt. It can be seen as the simplest kind of feed-forward neural network: a linear classifier. All biological analogies aside, the single-layer perceptron is simply a linear classifier which is efficiently trained by a simple update rule: for all wrongly classified data points, the weight vector is either increased or decreased by the corresponding example values.

Generalized Linear Model (not really a linear model)
Generalized linear models (GLMs) are an extension of traditional linear models. This algorithm fits generalized linear models to the data by maximizing the log-likelihood. The elastic net penalty can be used for parameter regularization. The model fitting computation is parallel, extremely fast, and scales extremely well for models with a limited number of predictors with non-zero coefficients.

Logistic Regression (not really a linear model)
This operator is a simplified version of the Generalized Linear Model operator. To perform logistic regression, the family parameter is set automatically to binomial and the link parameter to logit. Only the most crucial parameters can be adjusted for this operator, to provide an easy-to-use logistic regression. If you need a fine-tuned model, please use the Generalized Linear Model operator instead. The Logistic Regression implementation can handle training data with a binominal (or 2-class polynominal) label, and both nominal and numerical feature attributes.
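The "Classification by Regression" scheme described above is easy to express directly. The sketch below is a hedged illustration in scikit-learn (not RapidMiner's code): one regression model per class is trained on +1/-1 targets, and prediction picks the class whose regression model outputs the largest value.

```python
# Illustration of the one-regression-model-per-class scheme described above.
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import LinearRegression

class ClassificationByRegression:
    def __init__(self, base_regressor=None):
        self.base_regressor = base_regressor or LinearRegression()

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.models_ = []
        for c in self.classes_:
            target = np.where(y == c, 1.0, -1.0)   # +1 for class c, -1 otherwise
            self.models_.append(clone(self.base_regressor).fit(X, target))
        return self

    def predict(self, X):
        scores = np.column_stack([m.predict(X) for m in self.models_])
        return self.classes_[np.argmax(scores, axis=1)]
```

Any regression learner can be passed as base_regressor, which mirrors the role of the subprocess in the RapidMiner operator.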
Prepare the project:
kernel type: linear!!!
In the Classification by Regression model you must indicate, as the subprocess, which specific regression model is to be used. In this example we use the linear regression model. Note that the Linear Regression operator has built-in attribute selection algorithms. Question: does the model quality deteriorate if we turn these off? Investigate this for different classification problems (a sketch of such a comparison follows below).
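The comparison suggested above can be approximated outside RapidMiner as follows. This is a hedged sketch: scikit-learn's linear regression has no built-in attribute selection, so a greedy forward-selection step stands in for the operator's feature selection parameter, and RidgeClassifier is used because, like Classification by Regression, it fits a regression model to +/-1 targets; the dataset and the number of selected attributes are only examples.

```python
# Hedged sketch: does quality drop when attribute selection is turned off?
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

plain = make_pipeline(StandardScaler(), RidgeClassifier())
with_selection = make_pipeline(
    StandardScaler(),
    SequentialFeatureSelector(RidgeClassifier(), n_features_to_select=10),
    RidgeClassifier(),
)

for name, model in [("all attributes", plain), ("greedy selection", with_selection)]:
    scores = cross_val_score(model, X, y, cv=10)
    print(f"{name:18s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```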
Note the wide range of tuning options of the GLM model. A simpler implementation of the GLM is the Logistic Regression operator:
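As a rough analogue of such tuning (in scikit-learn, not RapidMiner), the sketch below fits a binomial logistic regression with an elastic net penalty and searches over the penalty strength C and the L1/L2 mixing ratio l1_ratio; the dataset and grid values are only examples.

```python
# Hedged sketch: elastic-net-regularized logistic regression with a small grid search.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

pipe = make_pipeline(
    StandardScaler(),
    LogisticRegression(penalty="elasticnet", solver="saga", max_iter=5000),
)
param_grid = {
    "logisticregression__C": [0.01, 0.1, 1.0, 10.0],       # inverse penalty strength
    "logisticregression__l1_ratio": [0.0, 0.5, 1.0],       # mix of L1 and L2 penalties
}
grid = GridSearchCV(pipe, param_grid, cv=10)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```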
Example results for: Perceptron, SVM (linear), LDA, QDA, RDA.
Example results (continued) for: Classification by Regression + linear regression, Logistic Regression, Generalized Linear Model.
ANOVA test result:
Pairwise t-test results:
Conclusion: there are statistically significant differences between the models.
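Such a statistical comparison can be reproduced outside RapidMiner's ANOVA and T-Test operators with the hedged sketch below: collect the per-fold accuracies of each model, run a one-way ANOVA across the models, and then pairwise t-tests on the folds; the models and dataset are only examples.

```python
# Hedged sketch: ANOVA and pairwise t-tests on per-fold cross-validation accuracies.
from itertools import combinations
from scipy.stats import f_oneway, ttest_rel
from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression, Perceptron
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)
models = {
    "LDA": LinearDiscriminantAnalysis(),
    "Perceptron": Perceptron(max_iter=1000),
    "Logistic regression": LogisticRegression(max_iter=5000),
}

# cross_val_score uses the same deterministic stratified folds for every model here,
# so the per-fold scores can be paired across models.
folds = {name: cross_val_score(m, X, y, cv=10) for name, m in models.items()}

print("ANOVA p-value:", f_oneway(*folds.values()).pvalue)
for a, b in combinations(folds, 2):
    print(f"{a} vs {b}: p = {ttest_rel(folds[a], folds[b]).pvalue:.4f}")
```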
Task: Repeat the computations for the messidor dataset. You may run into numerical problems; in this particular run, the QDA and RDA operators were the source of the problems. What could be causing them? (A diagnostic sketch follows below.)
Results for the remaining models: Perceptron, SVM (linear).
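One likely source of such numerical problems is that QDA and RDA estimate a separate covariance matrix for each class, and a (near-)singular class covariance, caused for example by collinear attributes or by a class with few examples, breaks the required matrix inversion. The hedged sketch below only inspects this; X and y are assumed to already hold the loaded data as numeric arrays.

```python
# Hedged diagnostic sketch: check the rank and conditioning of each class covariance.
import numpy as np

def check_class_covariances(X, y):
    for c in np.unique(y):
        Xc = X[y == c]
        S = np.cov(Xc, rowvar=False)
        print(f"class {c}: n = {len(Xc)}, "
              f"rank = {np.linalg.matrix_rank(S)} / {S.shape[0]}, "
              f"cond = {np.linalg.cond(S):.2e}")
```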
Results (continued) for: LDA, Classification by Regression, Logistic Regression, GLM.
ANOVA and pairwise t-test results:
Tasks:
1. Which of the above models are implemented in RapidMiner Studio in a way that allows them to be used for multi-class problems? Repeat the computations for the Glass dataset: https://archive.ics.uci.edu/ml/datasets/glass+identification
2. In the classification problems considered here, can you find a nonlinear model (e.g. a neural network, an SVM with a nonlinear kernel type, a decision tree, a Bayesian classifier, etc.) that is better than a well-prepared linear model? (A sketch of such a comparison follows below.)
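For the second task, the hedged sketch below compares linear and nonlinear models on the multi-class Glass data. The data-file URL and column names are assumptions based on the usual UCI repository layout, so adjust them to your local copy if needed; the model set is only an example starting point.

```python
# Hedged sketch: linear vs nonlinear classifiers on the Glass data (6 classes).
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Assumed UCI data-file location and column order; verify against the dataset page.
url = "https://archive.ics.uci.edu/ml/machine-learning-databases/glass/glass.data"
cols = ["Id", "RI", "Na", "Mg", "Al", "Si", "K", "Ca", "Ba", "Fe", "Type"]
df = pd.read_csv(url, names=cols)
X, y = df.drop(columns=["Id", "Type"]).values, df["Type"].values

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "Logistic regression": LogisticRegression(max_iter=5000),
    "SVM (RBF kernel)": SVC(kernel="rbf"),
    "Decision tree": DecisionTreeClassifier(),
    "Neural network": MLPClassifier(max_iter=2000),
}
for name, model in models.items():
    scores = cross_val_score(make_pipeline(StandardScaler(), model), X, y, cv=5)
    print(f"{name:20s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```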