Learning in neural networks

9

How does learning in neural networks occur? What is the concept behind it? What is its relationship with "Deep Learning"?

    
asked by anonymous 22.08.2015 / 12:16

2 answers

10

You should really buy a book on the subject if you are genuinely interested. But the basics (and I do mean basics) are as follows:

1) A neuron has a number of inputs and only one output. The output can be seen as a "decision" made on the basis of the inputs.

2) The output of the neuron is well-behaved, that is, it is a value in a predetermined range (something like "between 0 and +1"), even though the neuron inputs may be of much greater magnitude.

3) To calculate the output, the neuron assigns a different "weight" to each of the inputs and makes a weighted linear sum of them. The "weight" of each input can be changed.

output_linear = weight_a . input_a + weight_b . input_b + ...

Of course, if one of the inputs is very large, it will end up dominating the output even if its weight is small.

The "weights" stored in each neuron are the system memory.

4) For the output to be "well-behaved" the result of the linear sum is compressed by a non-linear function, such as the sigmoid function:

output = 1 / (1 + exp(-output_linear))

The use of a nonlinear function in the output is one of the aspects that ensures that a neural network can "learn" any function.
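Just to make items 3 and 4 concrete, here is a minimal sketch of such a neuron in Python (the names sigmoid and neuron_output are mine, chosen only for illustration):

import math

def sigmoid(x):
    # compresses any real value into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights):
    # weighted linear sum of the inputs...
    output_linear = sum(w * x for w, x in zip(weights, inputs))
    # ...compressed by the non-linear sigmoid
    return sigmoid(output_linear)

print(neuron_output([2.0, 500.0], [0.9, 0.1]))   # prints something very close to 1.0

With output_linear = 51.8, the sigmoid is essentially saturated at 1; the non-linearity keeps the huge input from producing a huge output.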

5) A single neuron, also called a Perceptron, is already useful for some simple decisions, such as whether to stop the car or keep going at a crossroads. One input is the red light, another may be an approaching ambulance (whose weight must be high, because it takes priority over the red light), and so on.

A Perceptron would also be able to calculate how much soap the washing machine should use depending on some variables, or what the selling price of a product should be for it to make a profit.
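A sketch of the crossroads decision, reusing neuron_output from above (the weights, the 0.5 threshold and the constant 1.0 input acting as a bias are all invented just to illustrate the idea):

red_light = 1.0   # 1.0 = red light on, 0.0 = off
ambulance = 1.0   # 1.0 = ambulance approaching, 0.0 = none

# the constant 1.0 input acts as a bias ("go" by default);
# the red light pushes towards "stop", the heavier ambulance weight pushes towards "go"
decision = neuron_output([1.0, red_light, ambulance], [1.0, -2.0, 4.0])
print("go" if decision > 0.5 else "stop")   # prints "go": the ambulance outweighs the red light

With the red light on and no ambulance, output_linear becomes 1.0 - 2.0 = -1.0, the sigmoid gives about 0.27, and the decision is "stop".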

6) A more capable neural network than the Perceptron has one or more hidden layers, that is, groups of neurons that are neither directly connected to the input nor to the output, forming a mesh of synapses (connections between neurons).

An extremely simple function, such as the XOR (exclusive or) function, cannot be learned by a Perceptron, but can be learned by a neural network with a hidden layer. Abusing the metaphor a bit, a Perceptron does not learn functions with "altruistic" characteristics.
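Here is what a hidden layer buys you in the XOR case, again reusing neuron_output from above; the weights are hand-picked (not learned), just to show that such a network can represent XOR (the constant 1.0 input is the same bias trick as before):

def xor_net(a, b):
    # hidden layer: two neurons, each seeing the bias plus the two inputs
    h1 = neuron_output([1.0, a, b], [-10.0, 20.0, 20.0])    # behaves like OR
    h2 = neuron_output([1.0, a, b], [30.0, -20.0, -20.0])   # behaves like NAND
    # output neuron combines the two hidden outputs, behaving like AND
    return neuron_output([1.0, h1, h2], [-30.0, 20.0, 20.0])

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(xor_net(a, b)))   # prints 0, 1, 1, 0

A single Perceptron cannot reproduce this table, no matter which weights you choose.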

7) Through the backpropagation mechanism, it is possible to "train" a neural network. For this, there must be a learning phase, in which the network's neurons are presented with a certain set of inputs and the error (the difference between the observed and the expected output) is calculated. The error is used to recalculate the weights of the neural network, from back to front (starting from the output neuron and working from there towards the inputs).

If the neurons make use of a nonlinear function for output, it can be proved that the neural network can "learn" any function via backpropagation.
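A minimal sketch of backpropagation for the XOR network above, in pure Python (the 2-2-1 architecture, the learning rate and the number of epochs are choices of mine; depending on the random initial weights the training can occasionally get stuck, but it usually converges):

import math, random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# w_hidden[j] holds the weights (bias, a, b) of hidden neuron j;
# w_out holds the weights (bias, h1, h2) of the output neuron
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]
rate = 0.5
samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

for epoch in range(20000):
    for (a, b), target in samples:
        # forward pass
        x = [1.0, a, b]
        h = [sigmoid(sum(w * xi for w, xi in zip(wj, x))) for wj in w_hidden]
        hx = [1.0] + h
        out = sigmoid(sum(w * hi for w, hi in zip(w_out, hx)))

        # backward pass: error at the output, propagated towards the inputs
        delta_out = (out - target) * out * (1 - out)
        delta_hidden = [delta_out * w_out[j + 1] * h[j] * (1 - h[j]) for j in range(2)]

        # each weight moves a little against the gradient of the error
        w_out = [w - rate * delta_out * hi for w, hi in zip(w_out, hx)]
        for j in range(2):
            w_hidden[j] = [w - rate * delta_hidden[j] * xi for w, xi in zip(w_hidden[j], x)]

for (a, b), target in samples:
    x = [1.0, a, b]
    h = [sigmoid(sum(w * xi for w, xi in zip(wj, x))) for wj in w_hidden]
    out = sigmoid(sum(w * hi for w, hi in zip(w_out, [1.0] + h)))
    print(a, b, round(out, 2), "expected:", target)   # outputs should end up close to 0, 1, 1, 0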

The learning process and the operation of the neural network are essentially statistical, analogous to fuzzy logic. A neural network trained to recognize letters will always respond with a degree of uncertainty (instead of "this letter is A", the output will be something like "95% chance of being the letter A").
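For instance, the raw outputs of a letter-recognition network are commonly turned into such percentages with a softmax (the scores below are invented for illustration):

import math

scores = {"A": 5.0, "B": 2.0, "C": 1.0}              # hypothetical raw network outputs
total = sum(math.exp(s) for s in scores.values())
for letter, s in scores.items():
    print(letter, round(100 * math.exp(s) / total, 1), "%")
# prints roughly: A 93.6 %, B 4.7 %, C 1.7 %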

Finally, an article where I compare neurons with economic agents, which may be of interest: link

    
23.08.2015 / 05:03
3

Neural networks are one of the most famous types of machine learning algorithms, and their main idea is basically to mimic the behavior of the human brain. If you have some knowledge of programming and statistics, you will better understand how these algorithms work.

What distinguishes one neural network from another is the training process. From the examples shown to it, the neural network adjusts its parameters according to the expected answers. For example, to train a neural network to classify news, labeled examples of news stories should be shown to it. In other words, the neural network tunes the "synapses" of its "brain" so that it can automatically classify new examples.

Deep Learning is a deeper, broader and more complex form of machine learning.

For more information on the subject and some of the concepts, visit this link.

    
24.08.2015 / 16:19