
Estimation and Learning of a Generative Model

11th-October-2020 For estimation, use the conditional probability of the hidden layer given the input. Benefits of generative models: input data...
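
As a minimal sketch of the idea for a binary RBM (the weight matrix W and hidden bias c here are illustrative placeholders, not values from the post), the hidden-layer conditional probability given an input might be computed as follows:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Hypothetical RBM parameters: W (visible x hidden), c (hidden bias)
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(784, 256))
    c = np.zeros(256)

    def hidden_given_visible(v):
        # p(h_j = 1 | v) = sigmoid(v . W[:, j] + c_j) for a binary RBM
        return sigmoid(v @ W + c)

    v = rng.integers(0, 2, size=784).astype(float)  # example binary input
    p_h = hidden_given_visible(v)                   # used as the inferred hidden representation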

Deep Learning using Generative Models

10th-October-2020 • In the deep learning methods described so far, the behavior of the network is described deterministically • In the...
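
As a rough illustration of the contrast the post sets up (not taken from the post itself), a deterministic unit outputs its activation directly, while a stochastic unit samples a binary state from that activation:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(1)
    x = rng.normal(size=10)
    W = rng.normal(scale=0.1, size=(10, 4))
    b = np.zeros(4)

    a = sigmoid(x @ W + b)                 # deterministic: the activation itself is the output
    h = (rng.random(4) < a).astype(float)  # stochastic: sample a binary state with probability a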

Activation function

6th-October-2020

Model of V2

3rd-October-2020 V2 model with a sparse RBM - Reproduces realistic V2 neuron responses [Ito & Komatsu 04] - Responds to more complex shapes...

Visual information processing of the brain

2nd-October-2020 Structure of the visual cortex (ventral visual pathway) - feedforward propagation - hierarchy: simple...

Local contrast normalization

1st-October-2020 A model of the behavior of visual cortex neurons - also used in image recognition - explanation based on natural image statistics...
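
A minimal numpy sketch of one common form of local contrast normalization (subtract the local mean, then divide by the local standard deviation over a small window); the window size and epsilon are illustrative choices, not values from the post:

    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_contrast_normalize(img, size=9, eps=1e-4):
        # Subtractive step: remove the local mean
        local_mean = uniform_filter(img, size=size)
        centered = img - local_mean
        # Divisive step: scale by the local standard deviation
        local_var = uniform_filter(centered ** 2, size=size)
        return centered / np.sqrt(local_var + eps)

    img = np.random.default_rng(2).random((64, 64))
    out = local_contrast_normalize(img)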

Sparse autoencoder

(30th-September-2020) Next, Topographic ICA is here.
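
Since this entry covers the sparse autoencoder, here is a minimal sketch of its objective with a KL-divergence sparsity penalty on the mean hidden activation; the target sparsity rho and weight beta are illustrative values, not ones from the post:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sparse_autoencoder_loss(X, W1, b1, W2, b2, rho=0.05, beta=3.0):
        H = sigmoid(X @ W1 + b1)        # encoder
        X_hat = sigmoid(H @ W2 + b2)    # decoder
        recon = np.mean(np.sum((X - X_hat) ** 2, axis=1))
        rho_hat = H.mean(axis=0)        # average activation of each hidden unit
        kl = np.sum(rho * np.log(rho / rho_hat) +
                    (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))
        return recon + beta * kl

    rng = np.random.default_rng(5)
    X = rng.random((20, 64))
    W1 = rng.normal(scale=0.1, size=(64, 16)); b1 = np.zeros(16)
    W2 = rng.normal(scale=0.1, size=(16, 64)); b2 = np.zeros(64)
    loss = sparse_autoencoder_loss(X, W1, b1, W2, b2)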

Visual information processing of the brain

(29th-September-2020) Gabor wavelets - position / orientation / scale - topographic map. Sparse Coding is here.
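
A small sketch of a 2-D Gabor filter parameterized by orientation and scale (position corresponds to where the filter is placed on the image); the parameter values here are arbitrary examples:

    import numpy as np

    def gabor(size=31, sigma=4.0, theta=0.0, wavelength=8.0, phase=0.0):
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        # Rotate coordinates by the orientation theta
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        envelope = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2))  # Gaussian envelope (scale)
        carrier = np.cos(2 * np.pi * xr / wavelength + phase)       # sinusoidal carrier
        return envelope * carrier

    filt = gabor(theta=np.pi / 4)   # 45-degree oriented filter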

Improved example of CNN: Tiled CNN

(28th-September-2020) A CNN can acquire only simple shift invariance - local receptive fields + tied weights • Tiled CNN - different filters...
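
A rough 1-D numpy sketch of the tiled-weight idea (weights are untied within a tile of size k and shared again every k positions); all names and sizes are illustrative:

    import numpy as np

    def tiled_conv1d(x, filters, k):
        # filters: shape (k, width) -- k distinct filters reused every k output positions
        width = filters.shape[1]
        out_len = len(x) - width + 1
        out = np.empty(out_len)
        for i in range(out_len):
            out[i] = x[i:i + width] @ filters[i % k]   # neighbouring outputs use different filters
        return out

    rng = np.random.default_rng(3)
    x = rng.normal(size=32)
    filters = rng.normal(size=(2, 5))   # tile size k = 2
    y = tiled_conv1d(x, filters, k=2)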

Random filter: the importance of architecture

(27th-September-2020) Make the filters random and train only the topmost fully-connected layer. The architecture is much more...
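
A hedged PyTorch sketch of the setup described (random, frozen convolutional filters with only the top fully-connected layer trained); the layer sizes are placeholders:

    import torch
    import torch.nn as nn

    features = nn.Sequential(
        nn.Conv2d(1, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
    )
    for p in features.parameters():
        p.requires_grad = False              # keep the random filters fixed

    classifier = nn.Linear(16 * 12 * 12, 10)  # only this layer is trained

    x = torch.randn(8, 1, 28, 28)
    logits = classifier(features(x).flatten(1))
    opt = torch.optim.SGD(classifier.parameters(), lr=0.1)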

Method and pooling: case study

(26th-September-2020) "Further data and more early GPUs can gain immediate performance improvement" [Krizhevsky + 12] • Large-scale...

Non-NN method and pooling

(25th-September-2020) Standard methods for general object recognition - SIFT [Lowe 99] (descriptor generation) - Bag of Features - Spatial...
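
A compact sketch of the Bag-of-Features step (quantize local descriptors against a learned codebook and build a histogram of visual words); the descriptors here are random stand-ins for SIFT output, and the codebook size is arbitrary:

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)
    descriptors = rng.normal(size=(500, 128))        # stand-in for SIFT descriptors pooled over images

    codebook = KMeans(n_clusters=32, n_init=10, random_state=0).fit(descriptors)

    def bag_of_features(image_descriptors):
        words = codebook.predict(image_descriptors)  # assign each descriptor to its nearest visual word
        hist = np.bincount(words, minlength=32).astype(float)
        return hist / hist.sum()                     # normalized visual-word histogram

    h = bag_of_features(rng.normal(size=(60, 128)))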

Convolutional Neural Network (CNN) Cont.

(24th-September-2020) Convolution + pooling is repeated (= Deep CNN) to acquire invariance to various deformations • Learnable - target:...
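
A minimal PyTorch sketch of the repeated convolution + pooling pattern (two stages here; depths and kernel sizes are arbitrary examples):

    import torch
    import torch.nn as nn

    deep_cnn = nn.Sequential(
        nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(),
        nn.Linear(64 * 8 * 8, 10),
    )

    x = torch.randn(4, 3, 32, 32)
    logits = deep_cnn(x)     # shape (4, 10)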

Convolutional Neural Network (CNN)

(23rd-September-2020) Roots in the Neocognitron [Fukushima 80] • Application of backpropagation to learning and handwritten character recognition...

Effect of pre-training

(22nd-September-2020) • Pre-training prevents overfitting - weights obtained by pre-training are used as initial values • All...
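
A sketch of the idea under simplifying assumptions: weights from an unsupervised pre-training step are copied in as the initial values of the supervised network before fine-tuning. The pretrain_layer helper below is hypothetical, standing in for e.g. RBM or autoencoder training:

    import torch
    import torch.nn as nn

    def pretrain_layer(in_dim, out_dim):
        # Hypothetical stand-in: in practice this would be an RBM / autoencoder
        # trained without labels; here it simply returns an initialized Linear layer.
        return nn.Linear(in_dim, out_dim)

    layer1 = pretrain_layer(784, 256)
    layer2 = pretrain_layer(256, 64)

    net = nn.Sequential(layer1, nn.Sigmoid(), layer2, nn.Sigmoid(), nn.Linear(64, 10))
    # Fine-tuning: all weights are updated, but training starts from the pre-trained values
    opt = torch.optim.SGD(net.parameters(), lr=0.01)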
