
Avoiding Overfitting

(13th-June-2020) • Overfitting can occur when some regularities appear in the training data that do not appear in the test data, and when...
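
A minimal sketch of the effect (the data and model choices below are illustrative, not from the post): fitting a high-degree polynomial to a few noisy points drives the training error toward zero while the test error grows.

import numpy as np

rng = np.random.default_rng(0)
true_f = lambda x: np.sin(2 * np.pi * x)
x_train = np.linspace(0, 1, 8)
y_train = true_f(x_train) + rng.normal(0, 0.2, x_train.shape)   # noisy training data
x_test = np.linspace(0, 1, 50)
y_test = true_f(x_test)                                          # noise-free test targets

for degree in (1, 3, 7):
    coeffs = np.polyfit(x_train, y_train, degree)                # fit a degree-d polynomial
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(degree, round(train_err, 4), round(test_err, 4))
# as the degree grows, training error shrinks but test error typically rises: overfitting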

Ensemble Learning

(12th-June-2020) • In ensemble learning, an agent takes a number of learning algorithms and combines their output to make a prediction....
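
A minimal sketch of one way to combine outputs (the base classifiers here are hypothetical stand-ins): take the majority vote of the individual predictions.

from collections import Counter

def majority_vote(predictions):
    # predictions: one class label per base learner
    return Counter(predictions).most_common(1)[0][0]

# three hypothetical base classifiers voting on the same example
print(majority_vote(["spam", "spam", "ham"]))   # -> spam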

Backpropagation Learning

(11th-June-2020) • Back-propagation learning is gradient descent search through the parameter space to minimize the sum-of-squares error.
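
In symbols, each weight is nudged against the gradient of that error: w ← w − η · ∂E/∂w, where E = Σ_e (ŷ(e) − y(e))² over the training examples and η is the step size.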

Backpropagation

(10th-June-2020) • Backpropagation is a method to calculate the gradient of the loss function with respect to the weights in an...
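
A minimal numerical sketch (not the post's code): one forward and one backward pass through a tiny two-layer network with squared-error loss, using the chain rule to get the gradient with respect to each weight matrix.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 3))            # one example with 3 input features
y = np.array([[1.0]])                  # its target value
W1 = 0.1 * rng.normal(size=(3, 4))     # input -> hidden weights
W2 = 0.1 * rng.normal(size=(4, 1))     # hidden -> output weights

# forward pass
h = np.tanh(x @ W1)                    # hidden activations
y_hat = h @ W2                         # linear output
loss = 0.5 * np.sum((y_hat - y) ** 2)

# backward pass: apply the chain rule layer by layer
d_yhat = y_hat - y                     # dLoss/dy_hat
dW2 = h.T @ d_yhat                     # dLoss/dW2
d_h = d_yhat @ W2.T                    # dLoss/dh
dW1 = x.T @ (d_h * (1 - h ** 2))       # dLoss/dW1, using tanh'(z) = 1 - tanh(z)^2

# one gradient-descent step with learning rate eta
eta = 0.1
W1 -= eta * dW1
W2 -= eta * dW2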

Neural Networks

(8th-June-2020) • Neural networks are a popular target representation for learning. These networks are inspired by the neurons in the...

A support vector machine (SVM)

(7th-June-2020) • A support vector machine (SVM) is used for classification. It uses functions of the original inputs as the inputs of...
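
A minimal sketch of that idea only (not an SVM training algorithm; the feature map and weights are made up): apply a linear decision rule to nonlinear functions of the original inputs, which can separate classes that are not linearly separable in the original space.

def phi(x1, x2):
    # hypothetical feature map: the original inputs plus squares and a product
    return [1.0, x1, x2, x1 * x2, x1 * x1, x2 * x2]

def classify(w, x1, x2):
    # linear function of the mapped features, thresholded at zero
    score = sum(wi * fi for wi, fi in zip(w, phi(x1, x2)))
    return 1 if score > 0 else 0

# weights chosen by hand to implement "inside the unit circle": 1 - x1^2 - x2^2 > 0
w = [1.0, 0.0, 0.0, 0.0, -1.0, -1.0]
print(classify(w, 0.2, 0.3), classify(w, 1.5, 0.0))   # -> 1 0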

Composite Models

(6th-June-2020) • Decision trees, (squashed) linear functions, and Bayesian classifiers provide the basis for many other supervised...

Bayesian Classifiers

(4th-June-2020) • A Bayesian classifier is based on the idea that the role of a (natural) class is to predict the values of features for...
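
A minimal naive Bayes sketch of that idea (the tiny dataset and feature names are made up): estimate P(class) and P(feature | class) from counts, then pick the class that makes the observed features most probable.

def train(examples):
    # examples: list of (class, {feature: value}) pairs
    class_counts, feature_counts = {}, {}
    for cls, feats in examples:
        class_counts[cls] = class_counts.get(cls, 0) + 1
        for f, v in feats.items():
            feature_counts[(cls, f, v)] = feature_counts.get((cls, f, v), 0) + 1
    return class_counts, feature_counts

def predict(class_counts, feature_counts, feats):
    total = sum(class_counts.values())
    best, best_score = None, -1.0
    for cls, n in class_counts.items():
        score = n / total                                                 # P(class)
        for f, v in feats.items():
            score *= (feature_counts.get((cls, f, v), 0) + 1) / (n + 2)   # smoothed P(feature | class)
        if score > best_score:
            best, best_score = cls, score
    return best

examples = [("spam", {"has_link": True}), ("spam", {"has_link": True}),
            ("ham", {"has_link": False}), ("ham", {"has_link": True})]
cc, fc = train(examples)
print(predict(cc, fc, {"has_link": True}))   # -> spam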

Logistic functions are often used in neural networks

(3rd-June-2020) • Logistic functions are often used in neural networks to introduce nonlinearity in the model and/or to clamp signals to...
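
A minimal sketch: the standard logistic (sigmoid) function is smooth and monotone and squashes any real-valued signal into the interval (0, 1).

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

for z in (-10, -1, 0, 1, 10):
    print(z, round(sigmoid(z), 4))
# large negative inputs map to values near 0, large positive inputs to values near 1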

Squashed Linear Functions

(1st-June-2020) • The use of a linear function does not work well for classification tasks. When there are only two values, say 0 and 1, a...
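
A minimal sketch of a squashed linear function for that two-class case (the weights are illustrative): pass the linear combination of the inputs through a logistic function so the output lies strictly between 0 and 1, can be read as P(Y=1), and can be thresholded for a 0/1 prediction.

import math

def squashed_linear(w, x):
    # w[0] is the bias; output is sigmoid(w0 + w1*x1 + ... + wn*xn)
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1.0 / (1.0 + math.exp(-z))

w = [-1.0, 2.0, 0.5]                       # hypothetical learned weights
p = squashed_linear(w, [1.0, 2.0])         # predicted P(Y = 1) for this example
print(round(p, 3), 1 if p >= 0.5 else 0)   # -> 0.881 1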

An algorithm, LinearLearner(X,Y,E,η)

(31st-May-2020) • An algorithm, LinearLearner(X,Y,E,η), for learning a linear function that minimizes the sum-of-squares error. Note that,...
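
A minimal sketch in the same spirit (not the post's exact pseudocode; the example data is made up): repeated gradient-descent updates of a linear predictor's weights to reduce the sum-of-squares error, with η as the step size.

def linear_learner(X, Y, E, eta=0.01, steps=1000):
    # X: input feature names; Y: name of the target (unused in this sketch);
    # E: list of (inputs, target) examples, with inputs given as a dict over X
    w0 = 0.0
    w = {f: 0.0 for f in X}
    for _ in range(steps):
        for inputs, target in E:
            pred = w0 + sum(w[f] * inputs[f] for f in X)
            err = pred - target
            w0 -= eta * err                      # gradient step for the bias
            for f in X:
                w[f] -= eta * err * inputs[f]    # gradient of (pred - target)^2 / 2
    return w0, w

E = [({"x": 0.0}, 1.0), ({"x": 1.0}, 3.0), ({"x": 2.0}, 5.0)]   # roughly y = 2x + 1
print(linear_learner(["x"], "y", E))   # approaches w0 ≈ 1, w["x"] ≈ 2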

Searching for a Good Decision Tree

(30th-May-2020) • Linear Regression and Classification: Linear functions provide the basis for many learning algorithms. In this section, we...

Learning Decision Trees

(29th-May-2020) • A decision tree is a simple representation for classifying examples. Decision tree learning is one of the most successful...
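
A minimal sketch of top-down decision-tree learning on Boolean features (the split choice and tiny dataset are simplified for illustration): split the examples on a feature, recurse on each partition, and return the majority class at the leaves.

def majority(examples):
    labels = [y for _, y in examples]
    return max(set(labels), key=labels.count)

def learn_tree(examples, features):
    # examples: list of ({feature: bool}, label) pairs
    if len({y for _, y in examples}) == 1 or not features:
        return majority(examples)         # leaf: a class label
    f = features[0]                       # naive split choice; real learners pick the most informative feature
    rest = features[1:]
    true_ex = [e for e in examples if e[0][f]]
    false_ex = [e for e in examples if not e[0][f]]
    if not true_ex or not false_ex:
        return majority(examples)
    return (f, learn_tree(true_ex, rest), learn_tree(false_ex, rest))

def classify(tree, x):
    while isinstance(tree, tuple):
        f, if_true, if_false = tree
        tree = if_true if x[f] else if_false
    return tree

data = [({"long": True, "new": True}, "reads"), ({"long": True, "new": False}, "skips"),
        ({"long": False, "new": True}, "reads"), ({"long": False, "new": False}, "reads")]
tree = learn_tree(data, ["long", "new"])
print(classify(tree, {"long": True, "new": False}))   # -> skips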

Basic Models for Supervised Learning

(28th-May-2020) • A learned model is a representation of a function from the input features to the target features. Most supervised...

Probabilities from Experts

(27th-May-2020) • The use of pseudocounts also gives us a way to combine expert opinion and data. Often a single agent does not have good...
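
A minimal sketch of the pseudocount idea (the numbers are made up): treat the expert's opinion as if it were prior observations and pool those pseudocounts with the observed counts before taking frequencies.

def prob_with_pseudocounts(observed, expert):
    # observed, expert: dicts mapping each value to a (pseudo)count
    total = sum(observed.values()) + sum(expert.values())
    return {v: (observed.get(v, 0) + expert.get(v, 0)) / total
            for v in set(observed) | set(expert)}

observed = {"works": 3, "fails": 1}    # data seen so far
expert = {"works": 8, "fails": 2}      # expert belief expressed as pseudocounts
print(prob_with_pseudocounts(observed, expert))   # works: 11/14, fails: 3/14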

Learning Probabilities

(26th-May-2020) • For many of the prediction measures, the optimal prediction on the training data is the empirical frequency. Thus, making...
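
For instance (an illustrative case, not from the post), if a Boolean feature was true in 3 of 10 training examples, the empirical frequency predicts P(true) = 3/10 = 0.3 for new cases drawn from the same distribution.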

Evaluating Predictions - II

(25th-May-2020) • Point Estimates with No Input Features • The simplest case for learning is when there are no input features and where...
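
A minimal numerical sketch (the numbers are illustrative): with no input features, a single point estimate is chosen to fit all the training targets, and which estimate is optimal depends on the error measure; the mean minimizes the sum-of-squares error, while a median minimizes the sum of absolute errors.

data = [1, 1, 2, 10]
mean = sum(data) / len(data)              # 3.5
median = sorted(data)[len(data) // 2]     # 2, one valid median
print(sum((y - mean) ** 2 for y in data), sum((y - median) ** 2 for y in data))   # 57.0 66: mean wins on squared error
print(sum(abs(y - mean) for y in data), sum(abs(y - median) for y in data))       # 13.0 10: median wins on absolute error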
