Generative Adversarial Networks (GANs)
Overview Generative adversarial networks (GANs) are a family of model architectures that use an adversarial training approach to produce a generative model capable of generating realistic synthetic data....
Overview Softmax is a ubiquitous function in modern machine learning. It is most often found as a top-level component of classification loss functions like cross entropy and negative log likelihood...
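As a taste of what the post covers, here is a minimal sketch of softmax (the function name and the max-subtraction stabilization trick are illustrative, not taken from the post itself):

```python
import numpy as np

def softmax(z):
    # Subtracting the max before exponentiating avoids overflow;
    # softmax is invariant to this additive shift.
    shifted = z - np.max(z)
    exp = np.exp(shifted)
    return exp / exp.sum()

# Logits -> probability distribution over 3 classes.
probs = softmax(np.array([2.0, 1.0, 0.1]))
```

The output always sums to 1 and preserves the ordering of the input logits.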
Overview Support vector machines (SVMs) are arguably the most powerful linear classification models in widespread use. They are linear models that can be trained using the hinge loss function, among...
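The hinge loss mentioned here can be sketched in a few lines (a simplified illustration under assumed conventions — labels in {-1, +1} and an unregularized mean loss — not necessarily the post's exact formulation):

```python
import numpy as np

def hinge_loss(w, b, X, y):
    # Labels y must be in {-1, +1}; margin_i = y_i * (w . x_i + b).
    margins = y * (X @ w + b)
    # Points classified correctly with margin >= 1 contribute zero loss.
    return np.maximum(0.0, 1.0 - margins).mean()

# Two points correctly classified beyond the margin -> zero loss.
X = np.array([[2.0, 0.0], [-3.0, 1.0]])
y = np.array([1.0, -1.0])
w = np.array([1.0, 0.0])
loss = hinge_loss(w, 0.0, X, y)
```

A point inside the margin (e.g. at distance 0.5 from the boundary) would contribute a positive loss even if classified correctly, which is what pushes the SVM toward a maximum-margin solution.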
Overview Cross entropy is a very common loss function used for classification tasks. Conceptually, cross entropy trains the model to produce a probability distribution over a set of two or more classes...
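A compact sketch of cross entropy from raw logits (the numerically stable log-sum-exp formulation is an assumption of this example, not necessarily how the post presents it):

```python
import numpy as np

def cross_entropy(logits, target):
    # Fuse softmax + negative log likelihood in one stable step:
    # log p_target = logit_target - logsumexp(logits).
    shifted = logits - np.max(logits)
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[target]

# Uniform logits over 2 classes -> loss of log(2).
loss = cross_entropy(np.array([0.0, 0.0]), target=0)
```

The loss falls toward zero as the target logit dominates, and grows without bound as probability mass moves to the wrong class.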
Overview Contrastive learning is an approach to training neural networks with an objective that differs in some key ways from other common training objectives. Contrastive learning can be...
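One common contrastive objective is InfoNCE, sketched below; the function name, cosine-similarity choice, and temperature value are illustrative assumptions, not details drawn from the post:

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.1):
    # Score the anchor against one positive and k negatives.
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    sims = np.array([cos(anchor, positive)]
                    + [cos(anchor, n) for n in negatives]) / tau
    # Cross entropy with the positive at index 0 (stable log-sum-exp).
    sims = sims - sims.max()
    return -(sims[0] - np.log(np.exp(sims).sum()))

# Anchor matches the positive, is orthogonal to the negative -> low loss.
low = info_nce(np.array([1.0, 0.0]), np.array([1.0, 0.0]),
               [np.array([0.0, 1.0])])
```

Swapping the positive and negative embeddings produces a large loss, which is the signal that pulls matching pairs together and pushes non-matching pairs apart.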
Overview Vector aggregation is a common task in machine learning. The general objective behind vector aggregation is: given a set of vectors with some shape $(\ell, d)$, we want to produce a single...
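The simplest instance of this $(\ell, d) \to (d,)$ reduction is mean pooling, sketched here; the optional padding mask is an assumed convenience, not necessarily part of the post:

```python
import numpy as np

def mean_pool(vectors, mask=None):
    # vectors: (l, d) array; mask: optional (l,) boolean marking valid rows,
    # useful for ignoring padding in variable-length sequences.
    if mask is not None:
        vectors = vectors[mask]
    return vectors.mean(axis=0)  # single (d,) vector

# Collapse three d=2 vectors (last row is padding) into one vector.
pooled = mean_pool(np.array([[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]),
                   mask=np.array([True, True, False]))
```

Other aggregators (max pooling, attention-weighted sums) follow the same shape contract and can be swapped in without changing downstream code.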
Goals In part 2 of our BatchNorm exploration, we will derive the gradients required during the backward pass of backpropagation, the method used to update neural network parameters. We assume the reader...
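For orientation, using the standard BatchNorm definitions over a batch of size $m$ ($\mu$, $\sigma^2$, $\hat{x}_i = (x_i - \mu)/\sqrt{\sigma^2 + \epsilon}$, $y_i = \gamma \hat{x}_i + \beta$), the gradients in question take the standard textbook form below (a sketch, not necessarily the post's notation):

$$\frac{\partial L}{\partial \gamma} = \sum_{i=1}^{m} \frac{\partial L}{\partial y_i}\,\hat{x}_i, \qquad \frac{\partial L}{\partial \beta} = \sum_{i=1}^{m} \frac{\partial L}{\partial y_i}$$

$$\frac{\partial L}{\partial x_i} = \frac{\gamma}{\sqrt{\sigma^2 + \epsilon}}\left(\frac{\partial L}{\partial y_i} - \frac{1}{m}\sum_{j=1}^{m}\frac{\partial L}{\partial y_j} - \frac{\hat{x}_i}{m}\sum_{j=1}^{m}\frac{\partial L}{\partial y_j}\,\hat{x}_j\right)$$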
Goals In this post, we will be seeking to understand batch normalization, which we will refer to as BatchNorm for brevity. We will specifically be focusing on understanding what BatchNorm does during...
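The training-time forward pass of BatchNorm can be sketched as follows (function and parameter names are illustrative; running statistics for inference are omitted for brevity):

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    # x: (N, D) mini-batch; normalize each feature across the batch dimension.
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)  # zero mean, unit variance per feature
    return gamma * x_hat + beta            # learnable scale and shift

# Normalize a batch drawn far from zero mean / unit variance.
x = np.random.default_rng(0).normal(5.0, 3.0, size=(64, 4))
out = batchnorm_forward(x, gamma=np.ones(4), beta=np.zeros(4))
```

With $\gamma = 1$ and $\beta = 0$, each feature of the output has (approximately) zero mean and unit variance over the batch, regardless of the input's scale.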