Fully connected layer

The simplest of neural networks

A fully connected layer (FCL) is one of the earliest forms of neural network layers, commonly used to classify inputs. Such a layer is characterized by every node having a connection (a weight) to *every* node in both the previous and the subsequent layer.
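Because every output node is connected to every input node, the layer reduces to a matrix-vector product plus a bias. A minimal sketch in NumPy (the function name `fully_connected` and the layer sizes are illustrative, not from the original text):

```python
import numpy as np

def fully_connected(x, W, b):
    # Each output node is a weighted sum over ALL input nodes, plus a bias.
    # W has one row per output node and one column per input node,
    # so every input-output pair gets its own weight.
    return W @ x + b

rng = np.random.default_rng(0)
x = rng.normal(size=4)        # 4 input nodes
W = rng.normal(size=(3, 4))   # 3 output nodes, each connected to all 4 inputs
b = np.zeros(3)               # one bias per output node

y = fully_connected(x, W, b)
print(y.shape)                # (3,)
```

The weight matrix has `inputs × outputs` entries, which is why the parameter count of fully connected layers grows quickly with layer width.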

Figure: an example fully connected layer

Fully connected layers are powerful; in fact, by the universal approximation theorem, a network consisting of one input layer, one output layer, and a single sufficiently wide FCL in between can approximate any continuous function to arbitrary precision. In practice, the number of nodes, and with it the compute time, required by such a network grows impractically large, so deeper and more specialized architectures are preferred.
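The one-hidden-layer architecture described above can be sketched as follows: a fully connected hidden layer with a nonlinearity, followed by a linear output layer. This is an illustrative sketch with random weights (the names `mlp`, `W1`, `W2` and the hidden width are assumptions, not from the original text); a real network would train these weights.

```python
import numpy as np

def mlp(x, W1, b1, W2, b2):
    # Hidden fully connected layer with a nonlinearity; the nonlinearity
    # is essential, otherwise the two layers collapse into one linear map.
    h = np.tanh(W1 @ x + b1)
    # Linear fully connected output layer.
    return W2 @ h + b2

rng = np.random.default_rng(1)
hidden = 16                                  # width of the single hidden layer
W1 = rng.normal(size=(hidden, 1))            # input layer -> hidden layer
b1 = np.zeros(hidden)
W2 = rng.normal(size=(1, hidden))            # hidden layer -> output layer
b2 = np.zeros(1)

y = mlp(np.array([0.5]), W1, b1, W2, b2)
print(y.shape)                               # (1,)
```

Widening `hidden` is what gives the network more approximation capacity, and is exactly the quantity that becomes impractically large for complicated target functions.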
