Table of contents:
Micro-Syllabus of Unit 5: Multilayer Perceptron (8 Hrs., 20 Marks)
Introduction, Batch Learning and On-Line Learning, The Back-Propagation Algorithm, The XOR Problem, Heuristics for Making the Back-Propagation Algorithm Perform Better, Back Propagation and Differentiation, The Hessian and Its Role in On-Line Learning, Optimal Annealing and Adaptive Control of the Learning Rate, Generalization, Approximations of Functions, Cross-Validation, Complexity Regularization and Network Pruning, Virtues and Limitations of Back-Propagation Learning, Supervised Learning Viewed as an Optimization Problem, Convolutional Networks, Nonlinear Filtering, Small-Scale Versus Large-Scale Learning Problems
🗒️ Note:
# Introduction to Multilayer Perceptrons:
- A multilayer feed-forward network consists of an input layer, one or more hidden layers, and an output layer. Computation takes place only in the hidden and output layers.
- The input signal propagates through the network in a forward direction, layer by layer. Such neural networks are called multilayer perceptrons (MLPs).
- They have been successfully applied to many difficult and diverse problems.
- Multilayer perceptrons are typically trained with the error back-propagation algorithm, a supervised error-correction learning algorithm (see the sketch after this list).
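Below is a minimal NumPy sketch of such a network: one hidden layer with sigmoid activations, trained by back-propagation on the XOR problem listed in the syllabus above. The concrete choices here (layer sizes, learning rate, epoch count, initialization) are illustrative assumptions, not values taken from these notes.

```python
# Minimal sketch: one-hidden-layer MLP trained with back-propagation
# on the XOR problem. Layer sizes, learning rate, and epoch count are
# illustrative assumptions, not prescriptions from the notes.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR patterns: not linearly separable, so a hidden layer is required.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Parameters: 2 inputs -> 3 hidden units -> 1 output unit.
W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)

eta = 0.5  # learning rate (assumed value)

for epoch in range(20000):
    # Forward pass: the input signal propagates layer by layer.
    h = sigmoid(X @ W1 + b1)        # hidden-layer outputs
    out = sigmoid(h @ W2 + b2)      # network output

    # Backward pass: error-correction terms, output layer first.
    delta_out = (out - y) * out * (1 - out)      # sigmoid'(v) = out*(1-out)
    delta_h = (delta_out @ W2.T) * h * (1 - h)   # error propagated backward

    # Batch-mode gradient descent: all four patterns in every update.
    W2 -= eta * h.T @ delta_out
    b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_h
    b1 -= eta * delta_h.sum(axis=0)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]
```

Note the two phases in each epoch: a forward pass that propagates the input layer by layer, and a backward pass that propagates error terms from the output layer back to the hidden layer. Because every update uses all four XOR patterns at once, this is the batch-learning mode mentioned in the syllabus; presenting one pattern per update would give on-line learning.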
Properties: