SNNS (Stuttgart Neural Network Simulator) is a software simulator for neural networks on Unix workstations, developed at the Institute for Parallel and Distributed High Performance Systems (IPVR) at the University of Stuttgart. The goal of the SNNS project is to create an efficient and flexible simulation environment for research on and application of neural nets.
http://en.wikipedia.org/wiki/SNNS
Petron, E. 1999. Stuttgart Neural Network Simulator: Exploring connectionism and machine learning with SNNS. Linux J. 1999, 63es (Jul. 1999), 2.
Neural Network Toolbox™ extends MATLAB® with tools for designing, implementing, visualizing, and simulating neural networks.
6.1 Introduction (p. 282)
LMS (least mean squares) algorithm (see the sketch at the end of this section)
multilayer neural networks / multilayer Perceptrons
The parameters governing the nonlinear mapping are learned at the same time as those governing the linear discriminant.
http://en.wikipedia.org/wiki/Multilayer_perceptron
http://en.wikipedia.org/wiki/Neural_network
the number of layers of units
(e.g., a network with two layers of units is often called a single-layer network, since it has only one layer of weights)
backpropagation algorithm / generalized delta rule: a natural extension of the LMS algorithm
- the intuitive graphical representation & the simplicity of design of models
- the conceptual and algorithmic simplicity
network architecture / topology
- neural net classification
- heuristic model selection (through choices in the number of hidden layers, units, feedback connections, and so on)
regularization
= complexity adjustment
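Since the chapter keeps referring back to LMS, here is a minimal sketch of the LMS weight update for a single linear unit (Python/NumPy; the learning rate eta and the toy data are my own illustrative choices, not from the book):

```python
import numpy as np

def lms_step(w, x, t, eta=0.1):
    """One LMS update: move w against the gradient of the squared error."""
    y = w @ x                   # linear unit output
    error = t - y               # difference from the target
    return w + eta * error * x  # w <- w + eta * (t - y) * x

# toy usage: learn t = 2*x1 - x2 from random samples
rng = np.random.default_rng(0)
w = np.zeros(2)
for _ in range(1000):
    x = rng.normal(size=2)
    t = 2 * x[0] - x[1]
    w = lms_step(w, x, t)
print(w)  # approaches [2, -1]
```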
6.2 Feedforward Operation and Classification (p. 284)
http://en.wikipedia.org/wiki/Artificial_neural_network
the function of units : "neurons"
the input units : the components of a feature vector
signals emitted by output units : the values of the discriminant functions used for classification
hidden units : the weighted sum of its inputs
-> net activation : the inner product of the inputs with the weights at the hidden unit
the input-to-hidden layer weights : "synapses" -> "synaptic weights"
activation function : "nonlinearity" of a unit
"Each ouput unit computes its net activation based on the hidden unit signals."
http://en.wikipedia.org/wiki/Feedforward_neural_network
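A minimal sketch of the feedforward operation described above (Python/NumPy; the logistic sigmoid as the nonlinearity and all names are illustrative assumptions, and bias terms are omitted for brevity):

```python
import numpy as np

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

def forward(x, W_hidden, W_output):
    """One feedforward pass: input -> hidden -> output.

    W_hidden: (n_hidden, n_input) input-to-hidden "synaptic" weights
    W_output: (n_output, n_hidden) hidden-to-output weights
    """
    net_hidden = W_hidden @ x  # net activation: inner product of inputs and weights
    y = sigmoid(net_hidden)    # hidden unit signals after the nonlinearity
    net_output = W_output @ y  # each output unit's net activation from hidden signals
    z = sigmoid(net_output)    # discriminant values used for classification
    return y, z
```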
6.2.1 General Feedforward Operation (p. 286)
6.2.2 Expressive Power of Multilayer Networks (p. 287)
"Any desired function can be implemented by a three-layer network."
-> but the problems of designing and training such networks remain
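As a concrete instance of the expressive-power claim, a three-layer network (one hidden layer) with hand-picked weights can implement XOR, which no network with a single layer of weights can. A sketch with hard threshold units; the specific weights are my own illustrative choice:

```python
import numpy as np

def step(net):
    return (net >= 0).astype(int)  # hard threshold unit

# hidden layer computes OR-like and AND-like functions of the inputs;
# the output unit fires for OR but is vetoed by AND, giving XOR
W_hidden = np.array([[1, 1],    # OR-like unit
                     [1, 1]])   # AND-like unit
b_hidden = np.array([-0.5, -1.5])
W_output = np.array([[1, -2]])
b_output = np.array([-0.5])

def xor_net(x):
    y = step(W_hidden @ x + b_hidden)
    return step(W_output @ y + b_output)[0]

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, xor_net(np.array(x)))  # prints 0, 1, 1, 0
```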
6.3 Backpropagation Algorithm (p. 288)
http://en.wikipedia.org/wiki/Backpropagation
- the problem of setting the weights (based on training patterns and the desired output)
- supervised training of multilayer neural networks
- the natural extension of the LMS algorithm for linear systems
- based on gradient descent
credit assignment problem
: no explicit teacher to state what the hidden unit's output should be
=> the power of backpropagation lies in calculating an effective error for each hidden unit, and thus deriving a learning rule for the input-to-hidden weights
sigmoid
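The logistic sigmoid is a standard choice of activation function here; what makes it convenient for backpropagation is the identity f'(net) = f(net) * (1 - f(net)), used in the learning rules below (minimal sketch):

```python
import numpy as np

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

def sigmoid_deriv(net):
    f = sigmoid(net)
    return f * (1.0 - f)  # f'(net) = f(net) * (1 - f(net))
```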
6.3.1 Network Learning
The final signals emitted by the network are used as discriminant functions for classification. During network training, these output signals are compared with a teaching or target vector t, and any difference is used in training the weights throughout the network.
training error : the squared difference between the network outputs and the targets, as in the LMS algorithm
gradient descent -> the weights are changed in a direction that will reduce the error
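A minimal sketch of one gradient-descent training step on the sum-of-squared-error criterion J = 0.5 * ||t - z||^2 for a network with one hidden layer; the delta terms are the "effective error" that backpropagation assigns to each unit (layer shapes, learning rate, and names are illustrative assumptions, biases omitted):

```python
import numpy as np

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

def backprop_step(x, t, W_hidden, W_output, eta=0.5):
    """One gradient-descent step on J = 0.5 * ||t - z||^2."""
    # forward pass
    y = sigmoid(W_hidden @ x)  # hidden signals
    z = sigmoid(W_output @ y)  # network outputs
    # backward pass: delta = sensitivity of J to each unit's net activation
    delta_out = (z - t) * z * (1 - z)                    # output units: explicit teacher
    delta_hid = (W_output.T @ delta_out) * y * (1 - y)   # hidden units: error propagated back
    # weights move against the gradient: Delta_w = -eta * dJ/dw
    W_output -= eta * np.outer(delta_out, y)
    W_hidden -= eta * np.outer(delta_hid, x)
    return W_hidden, W_output
```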
6.3.2 Training Protocol
6.3.3 Learning Curves
6.4 Error Surfaces
6.8.1 Activation Function