Table of contents:
Micro-Syllabus of Unit 7: Self-Organizing Maps (6 Hrs.) 0-5 marks
Introduction, Two Basic Feature-Mapping Models, Self-Organizing Map, Properties of the Feature Map, Contextual Maps, Hierarchical Vector Quantization, Kernel Self-Organizing Map, Relationship between Kernel SOM and Kullback-Leibler Divergence
🗒️Note:
# Introduction to Self-Organizing Maps:
- A self-organizing map (SOM) is a type of ANN that is trained using unsupervised learning to produce a low-dimensional (typically two-dimensional) discretized representation of the input space of the training samples.
- This low-dimensional representation can be viewed as a map; the SOM is therefore also a dimensionality-reduction method.
- Self-organizing maps differ from other artificial neural networks as they apply competitive learning as opposed to error-correction learning.
- In a SOM, all inputs are fully connected to the output neurons, and those neurons compete with each other. The output neuron that wins the competition is called the winning neuron or winner-takes-all neuron.
- Synaptic weights are adjusted in favor of the winning neuron, so that when the same or a similar input pattern is presented again, that neuron has a higher chance of winning the competition.
- In other words, the weights of the winning neuron are updated so that the Euclidean distance between the input vector and the neuron's weight vector is minimized.
- In a SOM, the output neurons are organized in a one- or two-dimensional lattice. Higher-dimensional lattices are also possible but are uncommon in practice.
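The competition and weight-update steps described above can be sketched in plain Python. This is a minimal illustration, not a full SOM implementation: the grid size, learning rate, and Gaussian neighbourhood radius are assumptions chosen for demonstration, and real trainings shrink the learning rate and radius over time.

```python
import math
import random

random.seed(0)

# Hypothetical setup: a 4x4 lattice of output neurons, each with a
# 3-dimensional weight vector, initialized randomly.
GRID, DIM = 4, 3
weights = [[[random.random() for _ in range(DIM)]
            for _ in range(GRID)] for _ in range(GRID)]

def bmu(x):
    """Competition step: the winning neuron is the one whose weight
    vector has the smallest Euclidean distance to the input x."""
    best, best_d = (0, 0), float("inf")
    for i in range(GRID):
        for j in range(GRID):
            d = math.dist(x, weights[i][j])
            if d < best_d:
                best, best_d = (i, j), d
    return best

def train_step(x, lr=0.1, radius=1.0):
    """Cooperation/adaptation step: move the winner and its lattice
    neighbours toward the input, weighted by a Gaussian neighbourhood."""
    wi, wj = bmu(x)
    for i in range(GRID):
        for j in range(GRID):
            lattice_d2 = (i - wi) ** 2 + (j - wj) ** 2
            h = math.exp(-lattice_d2 / (2 * radius ** 2))
            weights[i][j] = [w + lr * h * (xk - w)
                             for w, xk in zip(weights[i][j], x)]

x = [0.2, 0.8, 0.5]
wi, wj = bmu(x)
before = math.dist(x, weights[wi][wj])
train_step(x)
wi, wj = bmu(x)
after = math.dist(x, weights[wi][wj])
# After one update, the winner's weights are closer to the input.
```

Repeating `train_step` over many input samples, while gradually reducing `lr` and `radius`, is what causes the lattice to self-organize into a topology-preserving map of the input space.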