A biological neural network (BNN) is a structure consisting of synapses, dendrites, cell bodies (somas), and axons. In this network, processing is carried out by neurons: dendrites receive signals from other neurons, the soma sums the incoming signals, and the axon transmits the result to other cells. BNNs can process highly complex inputs in parallel; however, they lack a central controlling mechanism, and individual signals propagate slowly.
Meanwhile, an artificial neural network (ANN) is composed of artificial neurons, or nodes, organized into input, output, and hidden layers. Each node connects to others and sends data to the next layer only if its output, computed from its weighted inputs, exceeds a threshold. ANNs are usually trained from scratch with a fixed topology, although the topology can also change depending on the problem being solved. The weights of an ANN are randomly initialized and then adjusted via an optimization algorithm. ANNs can learn multiple types of data (linear or nonlinear) and handle highly volatile data well, which makes them well suited to financial time-series forecasting. However, an ANN’s architecture makes its outputs difficult to explain.
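The node behavior described above can be sketched as a single artificial neuron: a weighted sum of inputs passed through a threshold activation, with randomly initialized weights. The function and variable names below are illustrative, not from any particular library.

```python
import random

def artificial_neuron(inputs, weights, bias, threshold=0.0):
    """Compute the weighted sum of inputs plus a bias; the node 'fires'
    (outputs 1) only if the activation exceeds the threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > threshold else 0

# Weights are randomly initialized, as described above; in practice an
# optimization algorithm (e.g. gradient descent) would then adjust them.
random.seed(0)
weights = [random.uniform(-1, 1) for _ in range(3)]
bias = random.uniform(-1, 1)

output = artificial_neuron([0.5, -0.2, 0.9], weights, bias)
print(output)  # 0 or 1, depending on the random weights
```

In a full network, many such nodes are stacked into layers, and each node's binary (or, more commonly, continuous) output becomes an input to the nodes in the next layer.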
While artificial neurons and perceptrons were inspired by biological neural networks, they differ in several ways:
- Size: our brain contains about 86 billion neurons and more than 100 trillion synapses (connections), or, according to some estimates, 1,000 trillion. The number of “neurons” in artificial networks is far smaller (usually in the ballpark of 10–1,000).
- Topology: Artificial layers typically compute one at a time, rather than as part of a network whose nodes compute asynchronously. In biological networks, by contrast, neurons can fire asynchronously in parallel and exhibit a small-world structure, with a small portion of highly connected neurons and a large number of less-connected ones.
- Speed: Certain biological neurons can fire around 200 times a second on average. In artificial neurons, information is instead carried by continuous, floating-point weight values, and computation runs at the speed of the underlying hardware.
- Number of neurons: A typical ANN consists of hundreds to millions of neurons, while a BNN contains billions.
- Signals: In BNNs, an action potential is either triggered or not — biological synapses either carry a signal or they don’t. While perceptrons in ANNs work somewhat similarly, artificial neurons accept continuous values as inputs. The timing of the signals also differs: artificial neurons in the same layer receive their input signals and send their output signals synchronously, all at once.
- Flexibility: ANNs can support use cases like more efficient image recognition via convolutional neural networks (CNNs) or training across decentralized devices via federated learning (FL). BNNs, by contrast, offer properties such as neuroplasticity and entirely different forms of activation.
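The Signals difference above can be made concrete: a biological-style action potential is all-or-nothing, while an artificial activation (here a sigmoid, one common choice) passes a continuous value to the next layer. This is a minimal illustrative sketch; the names are assumptions, not standard APIs.

```python
import math

def spiking_output(membrane_potential, threshold=1.0):
    """Biological-style signal: an action potential either fires or it
    doesn't -- the output is binary, no matter how far the membrane
    potential exceeds the threshold."""
    return 1 if membrane_potential >= threshold else 0

def sigmoid_output(weighted_sum):
    """Artificial-style signal: a continuous activation whose exact
    magnitude is passed on to the next layer."""
    return 1.0 / (1.0 + math.exp(-weighted_sum))

print(spiking_output(0.7), spiking_output(2.3))  # 0 1
print(sigmoid_output(0.7), sigmoid_output(2.3))  # two values strictly between 0 and 1
```

Note how the spiking outputs discard the difference between 0.7 and 2.3 entirely, while the sigmoid outputs preserve it as a graded value — the core contrast drawn in the Signals bullet.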
Additional Reading About the Foundations of Neural Networks:
- An article comparing neurons to neural networks.
- A video comparing AI to the human brain.
- An overview of how neural networks learn.