⌁ An Intro to Neural Networks ⌁

Parts of a neural network

This page will go into detail about the final part of a network, the part that truly makes it a learning computer program: backpropagation. In addition, it will cover some of the mathematics behind neural networks.

Perhaps the most important process in a neural network, backpropagation is responsible for changing the synaptic weights, which regulate the "strength" of the input signals. It is the final box in the comic on page 1.
In backpropagation, the network measures how far off its prediction was, changes the synaptic weights to (hopefully) better values, and then makes another prediction. This process is repeated until the error is as small as the network can make it. As a step-by-step process, backpropagation can be seen as follows (assuming the network has already made one prediction):

  1. The network calculates its error: the expected (correct) output is subtracted from the actual output of the network
  2. The network calculates how much each synapse contributed to the error and then changes each weight accordingly
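
The first step can be sketched in a couple of lines of Python. The values below are made up for illustration, chosen so the error comes out to the 0.4 used in the example further down:

    # Step 1: subtract the expected (correct) output from the actual output
    actual_output = 1.4      # what the network actually predicted (made-up value)
    expected_output = 1.0    # what it should have predicted (made-up value)
    error = actual_output - expected_output   # 0.4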

The second step is done as follows:

new weights = error * weights * neuron values * inputs

It should be noted that the weights, new weights, neuron values, and inputs are represented by vectors (the error is a single number). The multiplication is done per synapse: each weight is multiplied by the error, by the value of the neuron it feeds into, and by the input it comes from. For example, if we have 3 inputs, 2 neurons, and 6 synapses, the calculation might look like this:
new weights = 0.4 * (0.7, 0.3, -0.3, 0.6, 0.6, 0.5) * (1.06, 0.94) * (1, 0.4, 0.8)
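
To make the per-synapse reading concrete, here is a minimal NumPy sketch of that calculation, arranging the weights as a 2 × 3 matrix (one row per neuron, one column per input). Note that this follows the simplified rule above; full backpropagation would also multiply in a learning rate and the derivative of each neuron's activation function.

    import numpy as np

    error = 0.4                             # how far off the prediction was
    weights = np.array([[0.7, 0.3, -0.3],   # 6 synapses: 2 neurons x 3 inputs
                        [0.6, 0.6, 0.5]])
    neurons = np.array([1.06, 0.94])        # value of each neuron
    inputs = np.array([1.0, 0.4, 0.8])      # value of each input

    # Multiply each weight by the error, by the neuron it feeds into (rows),
    # and by the input it comes from (columns)
    new_weights = error * weights * neurons[:, np.newaxis] * inputs[np.newaxis, :]
    # e.g. new_weights[0, 0] = 0.4 * 0.7 * 1.06 * 1.0 = 0.2968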

This page is a WIP