Artificial neuron
Artificial neurons are processing units based on the biological neural model. The first artificial neuron model was proposed by McCulloch and Pitts in 1943, and newer and more complex models have appeared since. Because the connectivity of biological neurons is far higher, artificial neurons should be regarded as only an approximation to the biological model.
Artificial neurons can be organized and connected to form artificial neural networks, which typically process the data carried by the neural connections in successive layers. Learning algorithms can also be applied to artificial neural networks in order to modify their behavior.
Behavior
Input signals are multiplied by values called weights, and the weighted inputs are summed to produce a value called the activation. The activation is then compared against a value called the threshold: if the activation does not exceed the threshold, it is filtered out and the neuron produces no output.
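As a minimal sketch of this behavior (the names neuron_output, weights, and threshold below are illustrative, not taken from any particular library), the weighted sum and threshold filter can be written as:

    # Minimal sketch of a threshold neuron (McCulloch-Pitts style).
    # All names and values here are illustrative assumptions.
    def neuron_output(inputs, weights, threshold):
        # Activation: weighted sum of the input signals.
        activation = sum(w * x for w, x in zip(weights, inputs))
        # Threshold filter: fire (1) only if the activation exceeds the threshold.
        return 1 if activation > threshold else 0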
The neuron's behavior can be mapped into a pattern hyperspace, which is separated by one or more decision hypersurfaces (lines or curves in 2D and surfaces in 3D).
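For example, using the sketch above with the illustrative weights (1, 1) and threshold 1.5, the decision hypersurface is the line x1 + x2 = 1.5 in the 2D pattern space, which is enough to implement the logical AND function:

    # Illustrative usage: only (1, 1) lies on the firing side of the
    # line x1 + x2 = 1.5, so the neuron computes AND.
    for pattern in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(pattern, neuron_output(pattern, weights=(1, 1), threshold=1.5))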
Transfer Functions
Transfer function is the name given to the function that applies the threshold to the activation value. These functions can be discrete or continuous, and they can also be defined as step functions.
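As an illustration (the particular functions chosen here are common examples, not an exhaustive list), a discrete step transfer function and a continuous sigmoid transfer function can be written as:

    import math

    # Discrete transfer function: a step applied at the threshold.
    def step_transfer(activation, threshold):
        return 1 if activation > threshold else 0

    # Continuous transfer function: a sigmoid centered on the threshold.
    def sigmoid_transfer(activation, threshold):
        return 1.0 / (1.0 + math.exp(-(activation - threshold)))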
Impulse pass
Depending on the network model, neurons can pass their impulses forward to their terminals, or signals can also be propagated backwards. This "backward pass" can be observed in learning algorithms such as backpropagation.
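A minimal sketch of one forward and one backward pass for a single neuron is shown below, assuming a sigmoid transfer function, a squared-error loss, and a learning rate of 0.1 (all three are assumptions made for this illustration):

    import math

    def forward(inputs, weights):
        # Forward pass: weighted sum, then sigmoid; the result is passed to the terminals.
        activation = sum(w * x for w, x in zip(weights, inputs))
        return 1.0 / (1.0 + math.exp(-activation))

    def backward(inputs, weights, target, learning_rate=0.1):
        # Backward pass: an error signal propagates back from the output to the weights.
        output = forward(inputs, weights)
        # Loss gradient times the derivative of the sigmoid.
        delta = (output - target) * output * (1.0 - output)
        # Each weight is adjusted against its contribution to the error.
        return [w - learning_rate * delta * x for w, x in zip(weights, inputs)]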
Analogy to Biological Neurons
Biological neurons exhibit a similar behavior. Inputs are electrical pulses that arrive at the synapses on the dendrites. These pulses trigger the release of neurotransmitters, which may alter the dendritic membrane potential (the post-synaptic potential). Post-synaptic potentials travel along the dendrites toward the cell body, where the neuron sums all the post-synaptic potentials it receives; if the total at the axon hillock exceeds a threshold, the neuron fires an output impulse that travels along the axon to other neurons.