diff --git a/glossary.yml b/glossary.yml
index a1f2e3cd..fcca6b9c 100644
--- a/glossary.yml
+++ b/glossary.yml
@@ -10120,4 +10120,88 @@
   term: "non-parametric (statistics)"
   def: >
     A branch of statistical tests which do not assume a known distribution of the population which the samples were taken from (Kruskal-Wallis and Dunn test are examples of non-parametric tests).
-
+- slug: artificial_intelligence
+  ref:
+    - nlp
+    - machine_learning
+  en:
+    term: "artificial intelligence (AI)"
+    def: >
+      Intelligence demonstrated by machines, as opposed to the natural intelligence of humans or
+      other animals. AI can be exhibited through perceiving, synthesizing, and inferring
+      information. Example tasks include [natural language processing](#nlp), computer vision,
+      and [machine learning](#machine_learning).
+
+- slug: cnn
+  ref:
+    - deep_learning
+    - backpropagation
+    - perceptron
+    - neural_network
+    - machine_learning
+  term: "convolutional neural network (CNN)"
+  def: >
+    A class of [artificial neural network](#neural_network) primarily used to analyze images.
+    A CNN has layers that perform convolutions, where a filter is shifted over the data, instead
+    of the general matrix multiplications used in fully connected neural network layers.
+
+- slug: rnn
+  ref:
+    - deep_learning
+    - backpropagation
+    - perceptron
+    - neural_network
+    - machine_learning
+  term: "recurrent neural network (RNN)"
+  def: >
+    A class of [artificial neural networks](#neural_network) where connections between nodes can
+    create a cycle, which allows the network to exhibit behavior that is dynamic over time. This
+    type of network is applicable to tasks like speech and handwriting recognition.
+
+- slug: epoch_dl
+  ref:
+    - deep_learning
+    - backpropagation
+    - perceptron
+    - neural_network
+    - machine_learning
+  term: "epoch (deep learning)"
+  def: >
+    In [deep learning](#deep_learning), an epoch is one cycle of the training process in which
+    all the training data has been fed to the algorithm once. Training a deep neural network
+    usually consists of multiple epochs.
+
+- slug: learning_rate
+  ref:
+    - deep_learning
+    - backpropagation
+    - perceptron
+    - neural_network
+    - machine_learning
+  term: "learning rate"
+  def: >
+    In [artificial neural networks](#neural_network), the learning rate is a hyperparameter that
+    determines the pace at which the network adjusts its weights to move down the loss gradient.
+    A large learning rate can speed up training, but the network might overshoot and miss the
+    minimum. A small learning rate overshoots less, but trains more slowly and can get stuck in
+    local minima more easily.
+
+- slug: class_imbalance
+  ref:
+    - machine_learning
+  term: "class imbalance"
+  def: >
+    Class imbalance refers to the problem in [machine learning](#machine_learning) where there is
+    an unequal distribution of classes in the dataset.
+
+- slug: hidden_layer
+  ref:
+    - neural_network
+    - machine_learning
+    - deep_learning
+    - perceptron
+  term: "hidden layer"
+  def: >
+    A hidden layer in a [neural network](#neural_network) is a layer of neurons that is not
+    directly connected to the input or output. These layers are "hidden" because you do not
+    directly observe their input and output values.