scalation.modeling.classifying
Members list
Type members
Classlikes
The BaggingTrees class uses several randomly built decision trees for classification. It randomly selects sub-samples of size bRatio * x.dim from the data x and y to build nTrees decision trees. The classify method uses voting from all of the trees. Note: this classifier does not select sub-features to build the trees.
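To make the idea concrete, here is a minimal standalone Scala sketch of bootstrap sub-sampling and majority voting; the names (Tree, subSample, vote) are illustrative only and not part of the BaggingTrees API.
import scala.util.Random

type Tree = Array [Int] => Int                                   // a trained tree: instance -> predicted class

def subSample (m: Int, bRatio: Double, rng: Random): Array [Int] =
    Array.fill ((bRatio * m).toInt) (rng.nextInt (m))            // row indices drawn with replacement

def vote (trees: Seq [Tree], z: Array [Int], k: Int): Int =
    val count = new Array [Int] (k)
    for t <- trees do count (t (z)) += 1                         // each tree casts one vote for z
    count.indexOf (count.max)                                    // class with the most votes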
Value parameters
- cname_
-
the class names
- conts
-
the set of feature indices for variables that are treated as continuous
- fname_
-
the names of the variables/features
- hparam
-
the hyper-parameters
- k
-
the number of classes
- x
-
the data matrix (instances by features)
- y
-
the response/class labels of the instances
Attributes
- Companion
- object
- Supertypes
- Known subtypes
-
class RandomForest
The BaggingTrees companion object provides a factory method.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
BaggingTrees.type
The BayesClassifier trait provides methods for Bayesian Classifiers, including calculations of joint probabilities and Conditional Mutual Information (CMI). Make sure the variable values start at zero, otherwise call the shift2zero method. If the value counts (vc) are unknown, the vc_fromData method may be called.
Classifier.shift2zero (x)               // make sure values for all features start at zero
val vc = Classifier.vc_fromData (x)     // set value counts from data
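For intuition, here is a rough standalone re-implementation of what these two helpers do on a plain Array [Array [Int]] data matrix (assumed behavior, not ScalaTion's actual code).
def shift2zero (x: Array [Array [Int]]): Unit =
    for j <- x(0).indices do
        val mn = x.map (_(j)).min                                // smallest value in column j
        for i <- x.indices do x(i)(j) -= mn                      // shift so column j starts at 0

def vc_fromData (x: Array [Array [Int]]): Array [Int] =
    Array.tabulate (x(0).length) (j => x.map (_(j)).max + 1)     // value count per column (values 0 .. max)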
Value parameters
- k
-
the number of classes (defaults to binary (2-way) classification)
Attributes
- See also
-
bayesClassifierTest for calculating cmi and bayesClassifierTest2 for cmiMatrix
- Supertypes
-
class Object, trait Matchable, class Any
- Known subtypes
-
class TANBayes
The Classifier trait provides a framework for multiple predictive analytics techniques, e.g., NaiveBayes. x is multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector analog p_y, the response probability mass function (pmf).
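Fitting p_y amounts to estimating the relative frequency of each class; a minimal standalone sketch (illustrative only, not the Classifier trait's code) follows.
def p_yFromData (y: Array [Int], k: Int): Array [Double] =
    val nu = new Array [Int] (k)
    for c <- y do nu(c) += 1                                     // frequency of each class 0 .. k-1
    nu.map (_.toDouble / y.length)                               // relative frequencies = estimated pmf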
Value parameters
- cname
-
the names/labels for each class
- fname
-
the feature/variable names (if null, use x_j's)
- hparam
-
the hyper-parameters for the model
- k
-
the number of classes (categorical response values)
- x
-
the input/data m-by-n matrix
- y
-
the response/output m-vector (class values where y(i) = class for instance i)
Attributes
- Companion
- object
- Supertypes
- Known subtypes
-
class BaggingTrees, class RandomForest, class DecisionTree_C45, class DecisionTree_C45wp, class DecisionTree_ID3, class DecisionTree_ID3wp, class HiddenMarkov, class KNN_Classifier, class LinDiscAnalyis, class NaiveBayes, class NaiveBayesR, class NullModel, class SimpleLDA, class SimpleLogisticRegression, class LogisticRegression, class SupportVectorMachine, class TANBayes, class NeuralNet_2L_Ck, class NeuralNet_3L_C2
The Classifier companion object provides a method for testing predictive models.
Attributes
- Companion
- trait
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
Classifier.type
The DecisionTree companion object provides the hyper-parameters for the decision trees, bagging trees, and random forests.
Attributes
- See also
-
scalation.modeling.HyperParameter
- Companion
- trait
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
DecisionTree.type
The DecisionTree trait provides common capabilities for all types of decision trees.
Attributes
- Companion
- object
- Supertypes
-
class Object, trait Matchable, class Any
- Known subtypes
The DecisionTree_C45 class implements a Decision Tree classifier using the C45 algorithm. The classifier is trained using a data matrix x and a classification vector y. Each data vector in the matrix is classified into one of k classes numbered 0, ..., k-1. Each column in the matrix represents a feature (e.g., Humidity).
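The split criterion is based on entropy and information gain; below is a standalone sketch of those two quantities for one discrete feature (C4.5 additionally thresholds continuous features and normalizes the gain by split information, i.e., gain ratio, which this sketch omits; the names are illustrative, not the DecisionTree_C45 API).
def entropy (y: Seq [Int]): Double =
    val n = y.size.toDouble
    y.groupBy (identity).values.map { g =>
        val p = g.size / n
        -p * math.log (p) / math.log (2.0)                       // entropy in bits
    }.sum

def infoGain (xj: Seq [Int], y: Seq [Int]): Double =
    val n = y.size.toDouble
    val after = xj.zip (y).groupBy (_._1).values
                  .map (g => (g.size / n) * entropy (g.map (_._2))).sum
    entropy (y) - after                                          // entropy reduction from splitting on xj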
Value parameters
- cname_
-
the names for all classes
- conts
-
the set of feature indices for variables that are treated as continuous
- fname_
-
the names for all features/variables
- hparam
-
the hyper-parameters
- k
-
the number of classes
- x
-
the input/data matrix with instances stored in rows
- y
-
the response/classification vector, where y_i = class for row i of matrix x
Attributes
- Companion
- object
- Supertypes
-
trait DecisionTree, trait FitC, trait FitM, trait Classifier, trait Model, class Object, trait Matchable, class Any
- Known subtypes
-
class DecisionTree_C45wp
The DecisionTree_C45 companion object provides factory methods.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
DecisionTree_C45.type
The DecisionTree_C45wp class extends DecisionTree_C45 with pruning capabilities. The base class uses the C45 algorithm to construct a decision tree for classifying instance vectors.
Value parameters
- cname_
-
the names for all classes
- conts
-
the set of feature indices for variables that are treated as continuous
- fname_
-
the names for all features/variables
- hparam
-
the hyper-parameters
- k
-
the number of classes
- x
-
the input/data matrix with instances stored in rows
- y
-
the response/classification vector, where y_i = class for row i of matrix x
Attributes
- Companion
- object
- Supertypes
-
class DecisionTree_C45, trait DecisionTree, trait FitC, trait FitM, trait Classifier, trait Model, class Object, trait Matchable, class Any
The DecisionTree_C45wp companion object provides a factory function.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
DecisionTree_C45wp.type
The DecisionTree_ID3 class implements a Decision Tree classifier using the ID3 algorithm. The classifier is trained using a data matrix x and a classification vector y. Each data vector in the matrix is classified into one of k classes numbered 0, ..., k-1. Each column in the matrix represents a feature (e.g., Humidity).
Value parameters
- cname_
-
the name for each class
- fname_
-
the name for each feature/variable xj
- hparam
-
the hyper-parameters
- k
-
the number of classes
- x
-
the input/data m-by-n matrix with instances stored in rows
- y
-
the response/classification m-vector, where y_i = class for row i of matrix x
Attributes
- Companion
- object
- Supertypes
-
trait DecisionTree, trait FitC, trait FitM, trait Classifier, trait Model, class Object, trait Matchable, class Any
- Known subtypes
-
class DecisionTree_ID3wp
The DecisionTree_ID3 companion object provides a factory method.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
DecisionTree_ID3.type
The DecisionTree_ID3wp class extends DecisionTree_ID3 with pruning capabilities. The base class uses the ID3 algorithm to construct a decision tree for classifying instance vectors.
Value parameters
- cname_
-
the name for each class
- fname_
-
the name for each feature/variable xj
- hparam
-
the hyper-parameters
- k
-
the number of classes
- x
-
the input/data m-by-n matrix with instances stored in rows
- y
-
the response/classification m-vector, where y_i = class for row i of matrix x
Attributes
- Companion
- object
- Supertypes
-
class DecisionTree_ID3, trait DecisionTree, trait FitC, trait FitM, trait Classifier, trait Model, class Object, trait Matchable, class Any
The DecisionTree_ID3wp companion object provides a factory function.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
DecisionTree_ID3wp.type
The Example_BreastCancer object loads the breast cancer dataset for classifying whether a patient has breast cancer.
Attributes
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
Example_BreastCancer.type
The Example_Diabetes object loads the diabetes dataset for classifying whether a patient has diabetes.
Attributes
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
Example_Diabetes.type
The Example_Iris object is used to test all classifiers. This is the well-known classification problem on how to classify a flower.
val x = xy(?, 1 until 5)      // columns 1, 2, 3, 4
val y = xy(?, 5).toInt        // column 5
Attributes
- See also
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
Example_Iris.type
The Example_MTcars object provides the Motor Trend Car Road Tests dataset (mtcars) as a combined xy matrix.
Attributes
- See also
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
Example_MTcars.type
The Example_PlayTennis object is used to test all integer based classifiers. This is the well-known classification problem on whether to play tennis based on given weather conditions. Applications may need to slice 'xy'.
val x = xy.not(0, 4)      // columns 0, 1, 2, 3
val y = xy(?, 4)          // column 4
Attributes
- See also
-
euclid.nmu.edu/~mkowalcz/cs495f09/slides/lesson004.pdf
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
Example_PlayTennis.type
The Example_PlayTennis_Cont object is used to test integer/continuous classifiers. This is the well-known classification problem on whether to play tennis based on given weather conditions. Applications may need to slice 'xy'. The 'Cont' version uses continuous values for Temperature and Humidity.
val x = xy.not (?, 4)      // columns 0, 1, 2, 3
val y = xy(?, 4)           // column 4
Attributes
- See also
-
euclid.nmu.edu/~mkowalcz/cs495f09/slides/lesson004.pdf
sefiks.com/2018/05/13/a-step-by-step-c4-5-decision-tree-example
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
Example_PlayTennis_Cont.type
The FitC companion object records the indices and labels for the base Quality of Fit (QoF) metrics/measures for the classification techniques.
The FitC trait provides methods for determining the confusion matrix as well as derived Quality of Fit (QoF) measures such as pseudo R-squared, sst, sse, accuracy, precision, recall, specificity and Cohen's kappa coefficient.
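As a standalone illustration (not the FitC trait itself), a k-by-k confusion matrix and a few of the measures named above can be computed as follows; here rows are actual classes and columns are predicted classes, a convention assumed for this sketch and possibly the transpose of other sources.
def confusion (y: Array [Int], yp: Array [Int], k: Int): Array [Array [Int]] =
    val cm = Array.ofDim [Int] (k, k)
    for i <- y.indices do cm(y(i))(yp(i)) += 1                   // row = actual, column = predicted
    cm

def accuracy (cm: Array [Array [Int]]): Double =
    cm.indices.map (i => cm(i)(i)).sum.toDouble / cm.map (_.sum).sum

def recall (cm: Array [Array [Int]], c: Int): Double =
    cm(c)(c).toDouble / cm(c).sum                                // true positives / actual class c

def precision (cm: Array [Array [Int]], c: Int): Double =
    cm(c)(c).toDouble / cm.map (_(c)).sum                        // true positives / predicted class c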
Value parameters
- k
-
the number of distinct class values/labels (defaults to 2)
Attributes
- See also
-
modeling.FitM
Must call the confusion method before calling the other methods.
- Companion
- object
- Supertypes
- Known subtypes
-
class BaggingTrees, class RandomForest, class DecisionTree_C45, class DecisionTree_C45wp, class DecisionTree_ID3, class DecisionTree_ID3wp, class HiddenMarkov, class KNN_Classifier, class LinDiscAnalyis, class NaiveBayes, class NaiveBayesR, class NullModel, class SimpleLDA, class SimpleLogisticRegression, class LogisticRegression, class SupportVectorMachine, class TANBayes, class NeuralNet_2L_Ck, class NeuralNet_3L_C2
The HiddenMarkov class provides Hidden Markov Models (HMM). An HMM model consists of a probability vector pi and probability matrices a and b. The discrete-time system is characterized by a hidden state x(t) and an observed symbol/value y(t) at time t, which may be viewed as a time series.
pi(i)   = P(x(t) = i)
a(i, j) = P(x(t+1) = j | x(t) = i)
b(i, k) = P(y(t) = k | x(t) = i)
model (pi, a, b)
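For example, the probability of an observation sequence y under a given model (pi, a, b) can be computed with the forward algorithm; below is a minimal, unscaled standalone sketch (suitable only for short series, illustrative only, not the HiddenMarkov class).
def forwardProb (pi: Array [Double], a: Array [Array [Double]],
                 b: Array [Array [Double]], y: Array [Int]): Double =
    val n = pi.length
    var alpha = Array.tabulate (n) (i => pi(i) * b(i)(y(0)))               // alpha_0(i)
    for t <- 1 until y.length do
        alpha = Array.tabulate (n) (j =>
            (0 until n).map (i => alpha(i) * a(i)(j)).sum * b(j)(y(t)))    // induction step
    alpha.sum                                                              // P(y | pi, a, b)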
Value parameters
- a
-
the state transition probability matrix (n-by-n)
- b
-
the observation probability matrix (n-by-m)
- cname_
-
the class names for the states, e.g., ("Hot", "Cold")
- hparam
-
the hyper-parameters
- m
-
the number of observation symbols/values {0, 1, ... m-1}
- n
-
the number of (hidden) states in the model
- pi
-
the probability vector for the initial state
- y
-
the observation vector/observed discrete-valued time series
Attributes
- Companion
- object
- Supertypes
The HiddenMarkov companion object provides a convenience method for testing.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
HiddenMarkov.type
The KNN_Classifier class is used to classify a new vector z into one of k classes. It works by finding its kappa nearest neighbors. These neighbors essentially vote according to their classification. The class with the most votes is selected as the classification of z. Using a distance metric, the kappa vectors nearest to z are found in the training data, which is stored row-wise in the data matrix x. The corresponding classifications are given in the vector y, such that the classification for vector x(i) is given by y(i). FIX - cross validation uses test data for decision making, so when kappa = 1, acc = 100%
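A minimal standalone sketch of this voting scheme (squared Euclidean distance on plain arrays; illustrative only, not the KNN_Classifier API):
def knnClassify (x: Array [Array [Double]], y: Array [Int], k: Int,
                 kappa: Int, z: Array [Double]): Int =
    def dist2 (u: Array [Double], v: Array [Double]): Double =
        u.zip (v).map ((a, b) => (a - b) * (a - b)).sum                    // squared Euclidean distance
    val nearest = x.indices.sortBy (i => dist2 (x(i), z)).take (kappa)     // kappa closest training rows
    val count   = new Array [Int] (k)
    for i <- nearest do count (y(i)) += 1                                  // neighbors vote by class
    count.indexOf (count.max)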
Value parameters
- cname_
-
the names for all classes
- fname_
-
the names for all features/variables
- hparam
-
the hyper-parameters
- k
-
the number of classes
- kappa
-
the number of nearest neighbors to consider (k >= 3)
- x
-
the input/data matrix
- y
-
the classification of each vector in x
Attributes
- Companion
- object
- Supertypes
The KNN_Classifier companion object provides a factory method.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
KNN_Classifier.type
The LinDiscAnalyis class implements a Linear Discriminant Analysis (LDA) classifier. It places a vector into a group according to its maximal discriminant function. FIX - currently only works when the number of classes k = 2.
Value parameters
- cname_
-
the names for all classes
- fname_
-
the names for all features/variables
- hparam
-
the hyper-parameters
- k
-
the number of classes (k in {0, 1, ... k-1})
- x
-
the real-valued training/test data vectors stored as rows of a matrix
- y
-
the training/test classification vector, where y_i = class for row i of the matrix x
Attributes
- See also
-
en.wikipedia.org/wiki/Linear_discriminant_analysis
- Companion
- object
- Supertypes
The LinDiscAnalyis companion object provides a factory method.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
LinDiscAnalyis.type
The LogisticRegression class supports (binomial) logistic regression. In this case, x may be multi-dimensional [1, x_1, ... x_k]. Fit the parameter vector b in the logistic regression equation
logit (p_y) = b dot x + e = b_0 + b_1 * x_1 + ... b_k * x_k + e
where e represents the residuals (the part not explained by the model) and y is now binary.
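The prediction side of the model applies the logistic (sigmoid) function to b dot x; a standalone sketch follows (fitting b itself requires an optimizer and is omitted; names are illustrative, not the LogisticRegression API).
def sigmoid (t: Double): Double = 1.0 / (1.0 + math.exp (-t))

def classify (b: Array [Double], x: Array [Double]): (Int, Double) =
    val p = sigmoid (b.zip (x).map ((bi, xi) => bi * xi).sum)              // p_y = sigmoid (b dot x), x(0) = 1
    (if p >= 0.5 then 1 else 0, p)                                         // predicted class and probability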
Value parameters
- cname_
-
the names for both classes
- fname_
-
the names for all features/variables
- hparam
-
the hyper-parameters
- x
-
the input/design matrix augmented with a first column of ones
- y
-
the binary response vector, y_i in {0, 1}
Attributes
- See also
-
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
- Companion
- object
- Supertypes
-
class SimpleLogisticRegression, trait FitC, trait FitM, trait Classifier, trait Model, class Object, trait Matchable, class Any
The LogisticRegression companion object provides a factory method.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
LogisticRegression.type
The NaiveBayes class implements an Integer-Based Naive Bayes Classifier, which is a commonly used such classifier for discrete input data. The classifier is trained using a data matrix x and a classification vector y. Each data vector in the matrix is classified into one of k classes numbered 0, ..., k-1. Prior probabilities are calculated based on the population of each class in the training-set. Relative posterior probabilities are computed by multiplying these by values computed using conditional probabilities stored in Conditional Probability Tables (CPTs). The classifier is naive, because it assumes variable/feature independence and therefore simply multiplies the conditional probabilities.
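A standalone sketch of the resulting decision rule, class = argmax_c P(c) * prod_j P(z_j | c), with add-one (Laplace) smoothing; plain Scala for illustration, not the NaiveBayes class itself.
def naiveBayesClassify (x: Array [Array [Int]], y: Array [Int], vc: Array [Int],
                        k: Int, z: Array [Int]): Int =
    val m = y.length.toDouble
    val post = Array.tabulate (k) { c =>
        val rows  = x.indices.filter (i => y(i) == c)                      // training rows in class c
        val prior = rows.size / m                                          // P(c)
        val lik   = z.indices.map { j =>
            val cnt = rows.count (i => x(i)(j) == z(j))
            (cnt + 1.0) / (rows.size + vc(j))                              // add-one smoothed P(z_j | c)
        }.product
        prior * lik                                                        // proportional to the posterior
    }
    post.indexOf (post.max)                                                // argmax over classes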
Value parameters
- cname_
-
the name for each class
- fname_
-
the name for each feature/variable xj
- hparam
-
the hyper-parameters
- k
-
the number of classes
- vc
-
the value count (number of distinct values) for each feature/variable xj
- x
-
the input/data m-by-n matrix with instances stored in rows
- y
-
the response/classification m-vector, where y_i = class for row i of matrix x
Attributes
- Companion
- object
- Supertypes
NaiveBayes is the companion object for the NaiveBayes class.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
NaiveBayes.type
The NaiveBayesR class implements a Gaussian Naive Bayes Classifier, which is the most commonly used such classifier for continuous input data. The classifier is trained using a data matrix x and a classification vector y. Each data vector in the matrix is classified into one of k classes numbered 0, ..., k-1. Class probabilities are calculated based on the frequency of each class in the training-set. Relative probabilities are computed by multiplying these by values computed using conditional density functions based on the Normal (Gaussian) distribution. The classifier is naive, because it assumes feature independence and therefore simply multiplies the conditional densities.
Value parameters
- cname_
-
the names for all classes
- fname_
-
the names for all features/variables
- hparam
-
the hyper-parameters
- k
-
the number of classes
- x
-
the real-valued data vectors stored as rows of a matrix
- y
-
the class vector, where y_i = class for row i of the matrix x, x(i)
Attributes
- Companion
- object
- Supertypes
NaiveBayesR is the companion object for the NaiveBayesR class.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
NaiveBayesR.type
The Node class is used to hold information about a node in the decision tree.
Value parameters
- gn
-
the information gain recorded at this node
- j
-
the feature/variable number used for splitting (negative => leaf)
- leaf
-
whether the node is a leaf (terminal node)
- nu
-
the frequency count
- parent
-
the parent node (null for root)
- y
-
the response/decision value
Attributes
- Companion
- object
- Supertypes
-
trait Serializable, trait Product, trait Equals, trait Cloneable, class Object, trait Matchable, class Any
The NullModel class implements a Null Model Classifier, which is a simple classifier for discrete input data. The classifier is trained just using a classification vector y. Picks the most frequent class. Each data instance is classified into one of k classes numbered 0, ..., k-1. Note: the train method in the super class suffices.
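The whole model reduces to remembering the majority class; a one-method standalone sketch (illustrative only, not the NullModel class):
def nullClassify (y: Array [Int], k: Int): Int =
    val nu = new Array [Int] (k)
    for c <- y do nu(c) += 1                                               // class frequencies
    nu.indexOf (nu.max)                                                    // always predict the majority class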
Value parameters
- cname_
-
the names for all classes
- k
-
the number of distinct values/classes
- y
-
the response/output m-vector (class values where y(i) = class for instance i)
Attributes
- Companion
- object
- Supertypes
The NullModel companion object provides a factory method for creating null models.
The QoFC enum defines the Quality of Fit (QoF) measures for classifiers.
Value parameters
- name
-
the name of the QoF measure/metric
Attributes
- Supertypes
-
trait Enum, trait Serializable, trait Product, trait Equals, class Object, trait Matchable, class Any
The RandomForest class uses randomness for building decision trees in classification. It randomly selects sub-samples with size = bRatio * sample-size from the sample (with replacement) and uses the fbRatio fraction of sub-features to build the trees, and to classify by voting from all of the trees.
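The part that distinguishes a random forest from plain bagging is that each tree also sees only a random subset of the features; below is a standalone sketch of drawing the row and feature subsets (illustrative names, not the RandomForest API).
import scala.util.Random

def randomSubsets (m: Int, n: Int, bRatio: Double, fbRatio: Double,
                   rng: Random): (Array [Int], Array [Int]) =
    val rows = Array.fill ((bRatio * m).toInt) (rng.nextInt (m))           // row indices, with replacement
    val cols = rng.shuffle ((0 until n).toVector)
                  .take ((fbRatio * n).toInt.max (1)).toArray.sorted       // random subset of feature indices
    (rows, cols)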
Value parameters
- cname_
-
class names (array of string)
- conts
-
the set of feature indices for variables that are treated as continuous
- fname_
-
feature names (array of string)
- hparam
-
the hyper-parameters
- k
-
the number of classes
- x
-
the data matrix (instances by features)
- y
-
the response class labels of the instances
Attributes
- Companion
- object
- Supertypes
-
class BaggingTrees, trait FitC, trait FitM, trait Classifier, trait Model, class Object, trait Matchable, class Any
The RandomForest companion object provides a factory method.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
RandomForest.type
The SimpleLDA class implements a Linear Discriminant Analysis (LDA) classifier. It places a value into a group according to its maximal discriminant function.
Value parameters
- cname_
-
the names for all classes
- fname_
-
the name for the feature/variable
- hparam
-
the hyper-parameters
- k
-
the number of possible values for y (0, 1, ... k-1)
- x
-
the input/design matrix with only one column
- y
-
the response/classification vector, y_i in {0, 1}
Attributes
- See also
-
en.wikipedia.org/wiki/Linear_discriminant_analysis
- Supertypes
The SimpleLogisticRegression class supports (binomial) logistic regression. In this case, x is two-dimensional [1, x_1]. Fit the parameter vector b in the logistic regression equation
logit (p_y) = b dot x + e = b_0 + b_1 * x_1 + e
where e represents the residuals (the part not explained by the model) and y is now binary.
Value parameters
- cname_
-
the names for both classes
- fname_
-
the names for all features/variables
- hparam
-
the hyper-parameters
- x
-
the input/design matrix augmented with a first column of ones
- y
-
the binary response vector, y_i in {0, 1}
Attributes
- See also
-
see.stanford.edu/materials/lsoeldsee263/05-ls.pdf
- Companion
- object
- Supertypes
- Known subtypes
-
class LogisticRegression
The SimpleLogisticRegression companion object provides factory methods.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
SimpleLogisticRegression.type
The SupportVectorMachine class is a translation of pseudo-code from a modified SMO (Modification 2) found at the above URLs into Scala and includes a few simplifications (e.g., currently only works for linear kernels, dense data and binary classification).
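Once trained, a linear SVM classifies by the sign of w dot z + b; a standalone sketch of that decision function is shown below (training w and b is the SMO part and is omitted; names are illustrative, not the SupportVectorMachine API).
def svmClassify (w: Array [Double], b: Double, z: Array [Double]): Int =
    val score = w.zip (z).map ((wi, zi) => wi * zi).sum + b                // signed score of point z
    if score >= 0.0 then 1 else -1                                         // labels in {-1, +1}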
Value parameters
- cname_
-
the class names
- fname_
-
the feature/variable names
- hparam
-
the hyper-parameters
- x
-
the input/data matrix with points stored as rows
- y
-
the classification of the data points stored in a vector
Attributes
- Companion
- object
- Supertypes
The SupportVectorMachine companion object provides a factory method.
Attributes
- Companion
- class
- Supertypes
-
class Object, trait Matchable, class Any
- Self type
-
SupportVectorMachine.type
The TANBayes class implements an Integer-Based TAN (Tree-Augmented Naive) Bayes Classifier, which is a commonly used such classifier for discrete input data. The classifier is trained using a data matrix x and a classification vector y. Each data vector in the matrix is classified into one of k classes numbered 0, ..., k-1. Prior probabilities are calculated based on the population of each class in the training-set. Relative posterior probabilities are computed by multiplying these by values computed using conditional probabilities stored in Conditional Probability Tables (CPTs). The classifier is TAN, allowing each xj to utilize information from its x-parent.
Value parameters
- cname_
-
the names of the classes
- fname_
-
the names of the features/variables
- hparam
-
the hyper-parameters
- k
-
the number of classes
- vc
-
the value count (number of distinct values) for each feature
- x
-
the input/data m-by-n matrix
- y
-
the class vector, where y(i) = class for row i of matrix x
Attributes
- Companion
- object
- Supertypes
-
trait FitC, trait FitM, trait BayesClassifier, trait Classifier, trait Model, class Object, trait Matchable, class Any
Value members
Concrete methods
The baggingTreesTest main function tests the BaggingTrees class. It tests a simple case that does not require a file to be read.
runMain scalation.modeling.classifying.baggingTreesTest
Attributes
The baggingTreesTest2 main function tests the BaggingTrees class. It tests the Bagging Trees classifier using the well-known WineQuality dataset.
runMain scalation.modeling.classifying.baggingTreesTest2
Attributes
The baggingTreesTest3 main function tests the BaggingTrees class. It tests the Bagging Trees classifier using specific numbers of trees.
runMain scalation.modeling.classifying.baggingTreesTest3
Attributes
The baggingTreesTest4 main function tests the BaggingTrees class. It tests Bagging Trees using unseen data.
runMain scalation.modeling.classifying.baggingTreesTest4
Attributes
The baggingTreesTest5 main function tests the BaggingTrees class. It tests the Bagging Trees classifier by specific numbers of trees.
runMain scalation.modeling.classifying.baggingTreesTest5
Attributes
The baggingTreesTest6 main function tests the BaggingTrees class. It tests the Bagging Trees classifier on the Breast Cancer dataset.
runMain scalation.modeling.classifying.baggingTreesTest6
Attributes
The baggingTreesTest7 main function tests the BaggingTrees class. It tests the Bagging Trees classifier on the Diabetes dataset.
runMain scalation.modeling.classifying.baggingTreesTest7
Attributes
The bayesClassifierTest main function is used to test the BayesClassifier trait. Calculate the CMI I(x; z | y); it should be 0.15834454180428106.
Attributes
- See also
-
stackoverflow.com/questions/55402338/finding-conditional-mutual-information-from-3-discrete-variable
runMain scalation.modeling.classifying.bayesClassifierTest
The bayesClassifierTest2 main function is used to test the BayesClassifier class using the Play Tennis Example. Calculate the CMI Matrix I(xj; xl | y) for j < l
runMain scalation.modeling.classifying.bayesClassifierTest2
Attributes
The classifierTest main function is used to test the Classifier trait and its derived classes using the Example_PlayTennis dataset containing the data matrix x and response vector y.
Attributes
- See also
-
Example_PlayTennis
runMain scalation.modeling.classifying.classifierTest
The decisionTreeTest main function is used to test the DecisionTree class.
runMain scalation.modeling.classifying.decisionTreeTest
Attributes
The decisionTree_C45Test main function tests the DecisionTree_C45 class. Ex: Classify (No/Yes) whether a person will play tennis based on the measured features.
Attributes
- See also
-
www.cise.ufl.edu/~ddd/cap6635/Fall-97/Short-papers/2.htm
runMain scalation.modeling.classifying.decisionTree_C45Test
The decisionTree_C45Test2 main function tests the DecisionTree_C45 class. Ex: Classify (No/Yes) whether a person will play tennis based on the measured features.
Attributes
- See also
-
www.cise.ufl.edu/~ddd/cap6635/Fall-97/Short-papers/2.htm
runMain scalation.modeling.classifying.decisionTree_C45Test2
The decisionTree_C45Test3 main function tests the DecisionTree_C45 class. Ex: Classify whether there is breast cancer.
runMain scalation.modeling.classifying.decisionTree_C45Test3
Attributes
The decisionTree_C45Test4 main function tests the DecisionTree_C45 class. Ex: Classify the quality of white wine.
runMain scalation.modeling.classifying.decisionTree_C45Test4
Attributes
The decisionTree_C45Test5 main function tests the DecisionTree_C45 class. Ex: Classify whether the patient has diabetes or not.
runMain scalation.modeling.classifying.decisionTree_C45Test5
Attributes
The decisionTree_C45wpTest main function tests the DecisionTree_C45wp class.
runMain scalation.modeling.classifying.decisionTree_C45wpTest
Attributes
The decisionTree_C45wpTest2 main function tests the DecisionTree_C45wp class.
runMain scalation.modeling.classifying.decisionTree_C45wpTest2
Attributes
The decisionTree_ID3Test main function tests the DecisionTree_ID3 class. Ex: Classify (No/Yes) whether a person will play tennis based on the measured features.
Attributes
- See also
-
www.cise.ufl.edu/~ddd/cap6635/Fall-97/Short-papers/2.htm
runMain scalation.modeling.classifying.decisionTree_ID3Test
The decisionTree_ID3Test2 main function tests the DecisionTree_ID3 class. Ex: Classify whether there is breast cancer.
runMain scalation.modeling.classifying.decisionTree_ID3Test2
Attributes
The decisionTree_ID3Test3 main function is used to test the DecisionTree_ID3 class. Plot entropy.
runMain scalation.modeling.classifying.decisionTree_ID3Test3
Attributes
The decisionTree_ID3wpTest main function tests the DecisionTree_ID3wp class.
runMain scalation.modeling.classifying.decisionTree_ID3wpTest
Attributes
The decisionTree_ID3wpTest2 main function tests the DecisionTree_ID3wp class.
runMain scalation.modeling.classifying.decisionTree_ID3wpTest2
Attributes
The decisionTree_ID3wpTest3 main function tests the DecisionTree_ID3wp class.
runMain scalation.modeling.classifying.decisionTree_ID3wpTest3
Attributes
The example_BreastCancerTest main function tests 16 of 18 classifiers on the Breast Cancer dataset. Ex: Classify whether there is breast cancer.
BaggingTrees, DecisionTree_C45, DecisionTree_C45wp, DecisionTree_ID3, DecisionTree_ID3wp, HiddenMarkov, KNN_Classifier, LinDiscAnalyis, LogisticRegression, NaiveBayes, NaiveBayesR, NeuralNet_3L_C2, NullModel, RandomForest, SupportVectorMachine, TANBayes.
Require having only a single feature: SimpleLDA, SimpleLogisticRegression => SKIP
runMain scalation.modeling.classifying.example_BreastCancerTest
Attributes
The example_DiabetesTest main function tests 16 of 18 classifiers on the Diabetes dataset. Ex: Classify whether a patient has diabetes.
BaggingTrees, DecisionTree_C45, DecisionTree_C45wp, DecisionTree_ID3, DecisionTree_ID3wp, HiddenMarkov, KNN_Classifier, LinDiscAnalyis, LogisticRegression, NaiveBayes, NaiveBayesR, NeuralNet_3L_C2, NullModel, RandomForest, SupportVectorMachine, TANBayes.
Require having only a single feature: SimpleLDA, SimpleLogisticRegression => SKIP
runMain scalation.modeling.classifying.example_DiabetesTest
Attributes
The example_IrisTest main function tests 16 of 18 classifiers on the Iris dataset. As this is an easy classification problem, classifiers should be nearly perfect.
BaggingTrees, DecisionTree_C45, DecisionTree_C45wp, DecisionTree_ID3, DecisionTree_ID3wp, HiddenMarkov, KNN_Classifier, LinDiscAnalyis, LogisticRegression, NaiveBayes, NaiveBayesR, NeuralNet_3L_C2, NullModel, RandomForest, SupportVectorMachine, TANBayes.
Require having only a single feature: SimpleLDA, SimpleLogisticRegression => SKIP
runMain scalation.modeling.classifying.example_IrisTest
Attributes
The example_PlayTennisTest main function tests several classifiers on the Play Tennis dataset. Tests all classes that extend from Classifier.
runMain scalation.modeling.classifying.example_PlayTennisTest
Attributes
The example_PlayTennis_ContTest main function tests several classifiers on the (cont) Play Tennis dataset. Tests all classes that extend from Classifier and include continuous predictors.
runMain scalation.modeling.classifying.example_PlayTennis_ContTest
Attributes
The fitCTest main function is used to test the FitC trait.
runMain scalation.modeling.classifying.fitCTest
Attributes
The fitCTest2 main function is used to test the FitC trait.
Attributes
- See also
-
www.quora.com/How-do-I-compute-precision-and-recall-values-for-a-dataset
runMain scalation.modeling.classifying.fitCTest2
The fitCTest3 main function is used to test the FitC trait.
Attributes
- See also
-
www.quora.com/How-do-I-compute-precision-and-recall-values-for-a-dataset
runMain scalation.modeling.classifying.fitCTest3
The fitCTest4 main function is used to test the FitC class.
Attributes
- See also
-
towardsdatascience.com/multi-class-metrics-made-simple-part-i-precision-and-recall-9250280bddc2
Note: ScalaTion's confusion matrix is the transpose of the one on the Website.
runMain scalation.modeling.classifying.fitCTest4
The hiddenMarkovTest main function is used to test the HiddenMarkov class. Given model (pi, a, b), determine the probability of the observations y.
Attributes
- See also
-
www.cs.sjsu.edu/~stamp/RUA/HMM.pdf (exercise 1).
runMain scalation.modeling.classifying.hiddenMarkovTest
The hiddenMarkovTest2 main function is used to test the HiddenMarkov class. Train the model (pi, a, b) based on the observed data.
Attributes
- See also
-
www.cs.sjsu.edu/~stamp/RUA/HMM.pdf.
runMain scalation.modeling.classifying.hiddenMarkovTest2
The hiddenMarkovTest3 main function is used to test the HiddenMarkov class. Given model (pi, a, b), determine the probability of the observations y.
Attributes
- See also
-
"Introduction to Data Science using ScalaTion"
runMain scalation.modeling.classifying.hiddenMarkovTest3
The hiddenMarkovTest4 main function is used to test the HiddenMarkov class. Train the model (pi, a, b) based on the observed data.
Attributes
- See also
-
"Introduction to Data Science using ScalaTion"
runMain scalation.modeling.classifying.hiddenMarkovTest4
The kNN_ClassifierTest main function is used to test the KNN_Classifier class.
runMain scalation.modeling.classifying.kNN_ClassifierTest
Attributes
The kNN_ClassifierTest2 main function is used to test the KNN_Classifier class.
runMain scalation.modeling.classifying.kNN_ClassifierTest2
Attributes
The kNN_ClassifierTest3 main function is used to test the KNN_Classifier class. It uses the Iris dataset where the classification/response y is unbalanced.
runMain scalation.modeling.classifying.kNN_ClassifierTest3
Attributes
The kNN_ClassifierTest4 main function is used to test the KNN_Classifier class. It uses the Iris dataset where the classification/response y is imbalanced and downsampling is used to balance the classification.
runMain scalation.modeling.classifying.kNN_ClassifierTest4
Attributes
The linDiscAnalyisTest main function is used to test the LinDiscAnalyis class.
Attributes
- See also
-
people.revoledu.com/kardi/tutorial/LDA/Numerical%20Example.html
runMain scalation.modeling.classifying.linDiscAnalyisTest
The logisticRegressionTest main function tests the LogisticRegression class on the mtcars dataset.
Attributes
- See also
-
Example_MTcars.scala
www.cookbook-r.com/Statistical_analysis/Logistic_regression/
Answer: b = (-8.8331, 0.4304), n_dev = 43.860, r_dev = 25.533, aic = 29.533, pseudo_rSq = 0.4178
runMain scalation.modeling.classifying.logisticRegressionTest
The logisticRegressionTest2 main function tests the LogisticRegression class.
Attributes
- See also
-
statmaster.sdu.dk/courses/st111/module03/index.html
www.stat.wisc.edu/~mchung/teaching/.../GLM.logistic.Rpackage.pdf
runMain scalation.modeling.classifying.logisticRegressionTest2
The naiveBayesRTest main function is used to test the NaiveBayesR class.
Attributes
- See also
-
people.revoledu.com/kardi/tutorial/LDA/Numerical%20Example.html
runMain scalation.modeling.classifying.naiveBayesRTest
The naiveBayesRTest2 main function is used to test the NaiveBayesR class. Ex: Classify whether a person is male (M) or female (F) based on the measured features.
Attributes
- See also
-
en.wikipedia.org/wiki/Naive_Bayes_classifier
runMain scalation.modeling.classifying.naiveBayesRTest2
The naiveBayesTest main function is used to test the NaiveBayes class.
runMain scalation.modeling.classifying.naiveBayesTest
Attributes
The naiveBayesTest2 main function is used to test the NaiveBayes class. Classify whether a car is more likely to be stolen (1) or not (0).
Attributes
- See also
-
www.inf.u-szeged.hu/~ormandi/ai2/06-naiveBayes-example.pdf
runMain scalation.modeling.classifying.naiveBayesTest2
The naiveBayesTest3 main function is used to test the NaiveBayes class. Given whether a person is Fast and/or Strong, classify them as making (C = 1) or not making (C = 0) the football team.
runMain scalation.modeling.classifying.naiveBayesTest3
Attributes
The naiveBayesTest4 main function is used to test the NaiveBayes class.
Attributes
- See also
-
archive.ics.uci.edu/ml/datasets/Lenses
docs.roguewave.com/imsl/java/7.3/manual/api/com/imsl/datamining/NaiveBayesClassifierEx2.html
runMain scalation.modeling.classifying.naiveBayesTest4
The nullModelTest main function is used to test the NullModel class. Classify whether to play tennis (1) or not (0).
runMain scalation.modeling.classifying.nullModelTest
Attributes
The randomForestTest main function is used to test the RandomForest class. It tests a simple case that does not require a file to be read.
runMain scalation.modeling.classifying.randomForestTest
Attributes
The randomForestTest2 main function is used to test the RandomForest class. It tests the Random Forest classifier using the well-known WineQuality dataset.
runMain scalation.modeling.classifying.randomForestTest2
Attributes
The randomForestTest3 main function is used to test the RandomForest class. It tests the Random Forest classifier by specific numbers of trees.
runMain scalation.modeling.classifying.randomForestTest3
Attributes
The randomForestTest4 main function is used to test the RandomForest class. It tests RF using unseen data.
runMain scalation.modeling.classifying.randomForestTest4
Attributes
The randomForestTest5 main function is used to test the RandomForest class. It tests the Random Forest classifier by specific numbers of trees.
runMain scalation.modeling.classifying.randomForestTest5
Attributes
The randomForestTest6 main function is used to test the RandomForest class. It tests the Random Forest classifier by specific numbers of trees.
runMain scalation.modeling.classifying.randomForestTest6
Attributes
The randomForestTest7 main function is used to test the RandomForest class. It tests the Random Forest classifier by specific numbers of trees.
runMain scalation.modeling.classifying.randomForestTest7
Attributes
The simpleLDATest main function tests the SimpleLDA class.
Attributes
- See also
-
people.revoledu.com/kardi/tutorial/LDA/Numerical%20Example.html
runMain scalation.modeling.classifying.simpleLDATest
The simpleLDATest2 main function tests the SimpleLDA class.
runMain scalation.modeling.classifying.simpleLDATest2
Attributes
The simpleLogisticRegressionTest main function tests the SimpleLogisticRegression class on the mtcars dataset. Use built-in optimizer.
Attributes
- See also
-
Example_MTcars
www.cookbook-r.com/Statistical_analysis/Logistic_regression/
Answer: b = (-8.8331, 0.4304), n_dev = 43.860, r_dev = 25.533, aic = 29.533, pseudo_rSq = 0.4178
runMain scalation.modeling.classifying.simpleLogisticRegressionTest
The simpleLogisticRegressionTest3 main function tests the SimpleLogisticRegression class. Compare SimpleLogisticRegressionTest with SimpleRegression.
Attributes
- See also
-
www.cookbook-r.com/Statistical_analysis/Logistic_regression/
Answer: b = (-8.8331, 0.4304), n_dev = 43.860, r_dev = 25.533, aic = 29.533, pseudo_rSq = 0.4178
runMain scalation.modeling.classifying.simpleLogisticRegressionTest3
The simpleLogisticRegressionTest4 main function tests the SimpleLogisticRegression class.
Attributes
- See also
-
people.revoledu.com/kardi/tutorial/LDA/Numerical%20Example.html
runMain scalation.modeling.classifying.simpleLogisticRegressionTest4
The simpleLogisticRegressionTest5 main function tests the SimpleLogisticRegression class.
runMain scalation.modeling.classifying.simpleLogisticRegressionTest5
Attributes
The simpleLogisticRegressionTest6 main function tests the logistic function.
runMain scalation.modeling.classifying.simpleLogisticRegressionTest6
Attributes
The supportVectorMachineTest main function tests the SupportVectorMachine class.
runMain scalation.modeling.classifying.supportVectorMachineTest
Attributes
The supportVectorMachineTest2 main function tests the SupportVectorMachine class.
runMain scalation.modeling.classifying.supportVectorMachineTest2
Attributes
The tANBayesTest main function is used to test the TANBayes class on the Play Tennis example problem.
runMain scalation.modeling.classifying.tANBayesTest
Attributes
The tANBayesTest2 main function is used to test the TANBayes class. Classify whether a car is more likely to be stolen (1) or not (0).
Attributes
- See also
-
www.inf.u-szeged.hu/~ormandi/ai2/06-tANBayes-example.pdf
runMain scalation.modeling.classifying.tANBayesTest2
The tANBayesTest3 main function is used to test the TANBayes class. Given whether a person is Fast and/or Strong, classify them as making (C = 1) or not making (C = 0) the football team.
runMain scalation.modeling.classifying.tANBayesTest3
Attributes
The tANBayesTest4 main function is used to test the TANBayes class.
Attributes
- See also
-
archive.ics.uci.edu/ml/datasets/Lenses
docs.roguewave.com/imsl/java/7.3/manual/api/com/imsl/datamining/TANBayesClassifierEx2.html
runMain scalation.modeling.classifying.tANBayesTest4