What defines a Neural Network as Perceptron?

6

I think it is important to talk about artificial intelligence; I have searched here on Stack Overflow and did not find anything relevant about it. Can anyone with experience explain the definition of a Perceptron neural network? We know that there are several types of neural networks, such as ART networks, Hopfield networks, and associative memories, among others. But the question here is: what sets a Perceptron network apart from the rest?

    
asked by anonymous 17.07.2017 / 16:12

2 answers

4

The perceptron is a single processing neuron with supervised learning. It receives impulses from various stimuli, applies the relative weights of its synapses, and then emits an output signal.

A perceptron network is a set of several perceptrons side by side, all receiving the same stimuli. Since one perceptron does not interfere with the result of another, each can be understood individually without any loss.

  

Do not confuse this with the Multi-Layer Perceptron, where the perceptrons are organized in layers.

The perceptron neuron learns from its mistakes. Yes, literally. And the correction depends on the size of the error: the larger the error, the faster the perceptron tries to correct itself.

The output of a perceptron is a real-valued function applied to a real number. The stimuli are transformed into a single real number through the scalar product of the stimulus values with the synapse weights. In summary, with X being the stimulus vector, p the result of the perceptron's processing, S the weights of its synapses, and f the perceptron's real function:

p = f(y)
y = X . S
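
A minimal sketch of that computation in Python, assuming numpy and taking f to be the sign function just for illustration (X, S and f follow the notation above):

import numpy as np

def perceptron_output(X, S, f=np.sign):
    # y = X . S : scalar product of the stimuli with the synapse weights
    y = np.dot(X, S)
    # p = f(y) : the perceptron's real function applied to that number
    return f(y)

X = np.array([0.5, -1.0, 2.0])   # example stimuli
S = np.array([0.8, 0.2, -0.4])   # example synapse weights
print(perceptron_output(X, S))   # prints -1.0, the sign of the weighted sum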

As I mentioned above, the perceptron has supervised learning; it is not self-organizing like the Kohonen networks. Supervised learning here means that, for each training input T_i, there is an expected result r_i. If p_i != r_i, there was a non-null error, called e_i.

Based on the e_i obtained for the input T_i, the values of the synapse weights S are corrected so that this error is eliminated or at least reduced in this learning step.
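
A sketch of that correction, assuming the classic perceptron rule in which the adjustment is proportional to the error (the learning rate eta is a name introduced here only for illustration; targets are taken as -1/+1):

import numpy as np

def train_step(S, T_i, r_i, eta=0.1, f=np.sign):
    p_i = f(np.dot(T_i, S))   # output for this training input
    e_i = r_i - p_i           # non-null when the prediction is wrong
    S = S + eta * e_i * T_i   # larger error -> larger correction
    return S, e_i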

How the training set is built and how its elements are presented can vary widely depending on who implements it. Usually the set is presented sequentially, pass after pass, until a convergence criterion is reached. The convergence criterion may be the cumulative total error over the set. Another interesting point in training is that the learning rate is usually reduced from one pass to the next. A rough sketch of such a loop is shown below.
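
Putting the pieces together, a rough sketch of such a training loop, with the cumulative error as the convergence criterion and the learning rate reduced between passes (all names and defaults here are illustrative):

import numpy as np

def train(T, r, S, eta=0.5, decay=0.9, max_epochs=100, f=np.sign):
    for epoch in range(max_epochs):
        total_error = 0.0
        for T_i, r_i in zip(T, r):          # present the set sequentially
            p_i = f(np.dot(T_i, S))
            e_i = r_i - p_i
            S = S + eta * e_i * T_i         # correct the synapses
            total_error += abs(e_i)
        if total_error == 0:                # convergence criterion reached
            break
        eta *= decay                        # smaller learning rate next pass
    return S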

    
18.07.2017 / 07:17
3

The Perceptron is the oldest of all neural networks. It is the simplest type of feedforward neural network and is known as a linear classifier. This means that the problems this network can solve must be linearly separable.
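
As a quick illustration of that limitation, a small sketch (the training code is only a rough example): a perceptron with a bias input can learn AND, which is linearly separable, but never settles on XOR, which is not:

import numpy as np

def train(X, r, epochs=50, eta=0.2):
    S = np.zeros(X.shape[1])
    for _ in range(epochs):
        for X_i, r_i in zip(X, r):
            p_i = 1 if np.dot(X_i, S) > 0 else 0
            S = S + eta * (r_i - p_i) * X_i   # perceptron learning rule
    return S

# Inputs with a constant bias term appended as the last component
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])
AND = np.array([0, 0, 0, 1])   # linearly separable: the perceptron learns it
XOR = np.array([0, 1, 1, 0])   # not linearly separable: it never converges

for name, r in [("AND", AND), ("XOR", XOR)]:
    S = train(X, r)
    predictions = [1 if np.dot(X_i, S) > 0 else 0 for X_i in X]
    print(name, predictions, "expected", list(r))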

    
17.07.2017 / 20:00