Neural Network
Fundamentals

A computing system inspired by biological neural networks that learns to perform tasks by considering examples, without being explicitly programmed.
A neural network is a computational model composed of layers of interconnected nodes (neurons) that process information using connectionist approaches. Each connection between neurons carries a weight that is adjusted during training, allowing the network to learn patterns from data.
Neural networks typically consist of an input layer, one or more hidden layers, and an output layer. During forward propagation, data flows through these layers, with each neuron applying a weighted sum followed by an activation function to produce its output. The network learns by comparing its predictions to expected outputs and adjusting weights through backpropagation.
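The flow described above can be sketched in plain Python. This is a minimal illustration, not a production implementation: the network shape (2 inputs, 3 hidden neurons, 1 output), the sigmoid activation, and all weight values are arbitrary choices for the example, and the `train_step` function shows only a single delta-rule update for one output neuron rather than full backpropagation through every layer.

```python
import math

def sigmoid(x):
    """Squash a weighted sum into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, layers):
    """Forward propagation: each neuron computes a weighted sum
    of the previous layer's activations, plus a bias, then applies
    the activation function."""
    activations = inputs
    for weights, biases in layers:
        activations = [
            sigmoid(sum(w * a for w, a in zip(neuron_w, activations)) + b)
            for neuron_w, b in zip(weights, biases)
        ]
    return activations

def train_step(weights, bias, x, target, lr=0.5):
    """One gradient-descent update for a single sigmoid neuron on one
    example (squared-error loss), illustrating the weight-adjustment
    idea behind backpropagation."""
    y = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
    # dLoss/dz for squared error through a sigmoid: (y - t) * y * (1 - y)
    delta = (y - target) * y * (1 - y)
    new_w = [w - lr * delta * xi for w, xi in zip(weights, x)]
    new_b = bias - lr * delta
    return new_w, new_b

# Hypothetical 2-3-1 network; weights are made up for illustration.
hidden = ([[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]], [0.0, 0.1, -0.1])
output = ([[0.7, -0.5, 0.2]], [0.05])
print(forward([1.0, 0.5], [hidden, output]))
```

Running `train_step` repeatedly with the same example nudges the neuron's prediction toward the target, which is the core mechanism a full training loop applies across all layers and many examples.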
Modern neural networks form the backbone of most AI systems today, from image classifiers to language models. Their ability to approximate complex functions and learn hierarchical representations of data has made them the dominant paradigm in machine learning research and applications.
Last updated: February 20, 2026