Self-organizing map

The self-organising map (SOM) is a method for unsupervised learning, based on a grid of artificial neurons whose weights are adapted to match input vectors in a training set. It was first described by the Finnish professor Teuvo Kohonen and is thus sometimes referred to as a Kohonen map.
A mathematical approach to the algorithm
Preliminary definitions
The actual training of the net is a form of vector quantization. The learning is unsupervised in the usual sense: no desired output is specified for any input; the map organises itself to reflect the structure of the training data.
To explain the algorithm in depth, let us create a 10×10 array of nodes. Each node contains a weight vector and is aware of its "physical location", i.e. its position in the array. The weight vector each node contains has the same dimension as the input vectors. (N.B. The weights of the nodes in the map are initially set to random values.)
Now we need input to feed the map. (Note: the generated map and the given input exist in separate spaces!) Following convention, we will create three vectors to represent colours. In the world of computing, a colour has three components: red, green, and blue. Consequently, our input vectors will have three components, one for each colour component. Our input vectors are:
R = <255, 0, 0>
G = <0, 255, 0>
B = <0, 0, 255>
Some variables
 Vectors are in bold
 t = current iteration
 λ = limit on time iterations
 Wv = current weight vector
 D = target input vector
 Θ(t) = restraint due to distance from BMU
 α(t) = learning restraint due to time
Stepping through the algorithm
 Randomize the map's nodes' weight vectors
 Grab an input vector
 Traverse each node in the map
 Use the Euclidean distance formula to find the similarity between the input vector and each map node's weight vector
 Track the node that produces the smallest distance (this node will be called the Best Matching Unit, or BMU)
 Update the nodes in the neighbourhood of the BMU by pulling them closer to the input vector
 Wv(t + 1) = Wv(t) + Θ(t)α(t)(D(t) − Wv(t))
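The steps above can be sketched in code. This is a minimal illustration, not a reference implementation: the function names, the decay schedules for Θ(t) and α(t) (exponential decay with a Gaussian neighbourhood), and the rescaling of the colour values to the range [0, 1] are all choices made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: randomise the map's weight vectors (10x10 nodes, 3 weights each).
som = rng.random((10, 10, 3))

# The three colour input vectors, rescaled from 0..255 to 0..1.
inputs = np.array([[255, 0, 0], [0, 255, 0], [0, 0, 255]], dtype=float) / 255.0

def bmu_index(som, x):
    """Return (row, col) of the node whose weight vector is closest to x."""
    d = np.linalg.norm(som - x, axis=2)          # Euclidean distance per node
    return np.unravel_index(np.argmin(d), d.shape)

def train(som, inputs, iterations=200, sigma0=5.0, alpha0=0.5):
    lam = iterations                             # λ: limit on time iterations
    rows, cols = np.indices(som.shape[:2])       # each node's grid location
    for t in range(iterations):
        x = inputs[t % len(inputs)]              # grab an input vector D(t)
        r, c = bmu_index(som, x)                 # find the Best Matching Unit
        sigma = sigma0 * np.exp(-t / lam)        # shrinking neighbourhood radius
        alpha = alpha0 * np.exp(-t / lam)        # α(t): learning restraint
        dist2 = (rows - r) ** 2 + (cols - c) ** 2
        theta = np.exp(-dist2 / (2 * sigma**2))  # Θ(t): restraint by distance
        # Wv(t+1) = Wv(t) + Θ(t) α(t) (D(t) − Wv(t))
        som += theta[..., None] * alpha * (x - som)
    return som

som = train(som, inputs)
```

After training, the node that wins for the red input has weights pulled close to the red vector, and likewise for green and blue.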
An analytic approach to the algorithm
The SOM algorithm is fed with feature vectors, which can be of any dimension. In most applications, however, the number of dimensions will be high. Output maps can also be made in different dimensions: 1-dimensional, 2-dimensional, etc., but 2D and 3D maps are the most popular, since SOMs are mainly used for dimensionality reduction rather than expansion.
The algorithm is explained most easily in terms of a set of artificial neurons, each having its own physical location on the output map, which take part in a winner-take-all process (a competitive network): the node whose weight vector is closest to the input vector is declared the winner, and its weights are adjusted to bring them closer to the input vector. Each node has a set of neighbours. When a node wins a competition, its neighbours' weights are also changed, though not as much: the further a neighbour is from the winner, the smaller its weight change. This process is then repeated for each input vector, over and over, for a (usually large) number of cycles. Different inputs produce different winners.
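The "further from the winner, smaller the change" rule can be realised with any decreasing function of grid distance; a Gaussian is a common choice. The sketch below uses a Gaussian with an illustrative width parameter; the function name and the value of sigma are assumptions, not part of any standard API.

```python
import math

def neighbourhood(dist, sigma=2.0):
    """Weight-change factor for a node at grid distance `dist` from the winner."""
    return math.exp(-dist**2 / (2 * sigma**2))

# The winner (distance 0) gets the full update; neighbours get
# progressively smaller updates as their grid distance grows.
factors = [neighbourhood(d) for d in range(5)]
```

In practice, sigma is also shrunk over time, so the neighbourhood tightens as training proceeds.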
The network winds up associating output nodes with groups or patterns in the input data set. If these patterns can be named, the names can be attached to the associated nodes in the trained net.
Like most artificial neural networks, the SOM has two modes of operation:
 During the training process a map is built; the neural network organises itself using a competitive process. The network must be given a large number of input vectors, representing as closely as possible the kind of vectors expected during the second phase (if any). Otherwise, all input vectors must be presented several times.
 During the mapping process a new input vector may quickly be given a location on the map: it is automatically classified or categorised. There will be one single winning neuron: the neuron whose weight vector lies closest to the input vector. (This can be simply determined by calculating the Euclidean distance between input vector and weight vector.)
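The mapping phase amounts to a single nearest-neighbour search over the node weights. A minimal sketch, using a hand-made 2×2 "trained" map purely for illustration (a real map would come out of the training phase above):

```python
import numpy as np

# A toy 2x2 map whose nodes have settled near red, green, blue, and yellow.
trained = np.array([[[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]],
                    [[0.0, 0.0, 1.0], [1.0, 1.0, 0.0]]])

def classify(som, x):
    """Return grid coordinates of the neuron whose weights lie closest to x."""
    d = np.linalg.norm(som - np.asarray(x, dtype=float), axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

# A slightly noisy red input maps to the node at (0, 0).
winner = classify(trained, [0.9, 0.1, 0.05])
```

If the nodes of the trained map have been labelled (e.g. "red", "green"), the returned coordinates directly yield the category of the new input.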
See Also
 Artificial intelligence
 Biologically-inspired computing
 Connectionism
 Data mining
 Generative Topographic Map
 Machine learning
 Pattern recognition
External Links
 Prof. Kohonen's website in Helsinki (http://www.cis.hut.fi/teuvo/)
 Example SOM by Robert Saunders in the form of a Java applet (source code included) (http://www.arch.usyd.edu.au/~rob/)