Category: Concepts

Dataset Distillation

Dataset distillation is the process of synthesizing a small artificial dataset such that models trained on it approach the performance of models trained on the original, much larger real dataset.
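A deliberately simplified sketch of the idea: compress a two-class dataset into one synthetic point per class (here just the class mean) and check that a classifier fit only on the distilled points still labels the full dataset. Real distillation methods optimize the synthetic points (e.g. by matching training gradients) rather than averaging; the data values below are made up for illustration.

```python
# Toy dataset distillation: replace each class with a single synthetic
# prototype (the class mean), then classify real points by nearest
# prototype. The distilled set has 2 points instead of 6.

real_data = [(0.9, 0), (1.1, 0), (1.0, 0),   # class 0, clustered near 1.0
             (4.8, 1), (5.2, 1), (5.0, 1)]   # class 1, clustered near 5.0

def distill(data):
    """Return one synthetic point per class: the class mean."""
    sums, counts = {}, {}
    for x, y in data:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

prototypes = distill(real_data)              # {0: 1.0, 1: 5.0}

def predict(x, prototypes):
    """Label x with the class of the nearest prototype."""
    return min(prototypes, key=lambda y: abs(x - prototypes[y]))

accuracy = sum(predict(x, prototypes) == y
               for x, y in real_data) / len(real_data)
```

Training on the two distilled points reproduces the decision boundary of the full six-point dataset, which is the property dataset distillation aims for at scale.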

Warm Starting

Warm starting refers to the practice of using the solution of a previous problem as the starting point for solving a new, related problem, which can improve convergence speed.
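A minimal sketch of the speedup: run gradient descent on the quadratic f(x) = (x - target)^2, then solve a nearby problem either from scratch (cold start) or from the previous solution (warm start). The targets and learning rate are illustrative choices, not from the source.

```python
# Gradient descent on f(x) = (x - target)**2, counting iterations
# until the gradient is below a tolerance.

def solve(target, x0, lr=0.1, tol=1e-6, max_iter=10_000):
    x, steps = x0, 0
    while abs(2 * (x - target)) > tol and steps < max_iter:
        x -= lr * 2 * (x - target)      # gradient step
        steps += 1
    return x, steps

x1, cold_steps = solve(target=10.0, x0=0.0)   # cold start, far from optimum
x2, warm_steps = solve(target=10.5, x0=x1)    # warm start near the new optimum
```

Because the warm start begins close to the new optimum, it needs noticeably fewer iterations than the cold start.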

DINO

DINO (self-DIstillation with NO labels) is a self-supervised learning method in which a student network is trained to match the outputs of a teacher network on different augmented views of the same image, where the teacher's weights are an exponential moving average of the student's; this form of knowledge distillation learns visual representations without labeled data.
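One distinctive piece of DINO is the teacher update: instead of being trained by gradients, the teacher's parameters track an exponential moving average (EMA) of the student's. A minimal sketch, with scalar lists standing in for real network weights (the momentum value 0.996 is in the range used by the paper, but the parameter values are made up):

```python
# EMA teacher update as used in DINO-style self-distillation:
# teacher <- momentum * teacher + (1 - momentum) * student.
# The teacher lags the student but evolves more smoothly.

def ema_update(teacher, student, momentum=0.996):
    return [momentum * t + (1 - momentum) * s
            for t, s in zip(teacher, student)]

teacher = [0.0, 0.0]       # illustrative "weights", not a real network
student = [1.0, -1.0]
for _ in range(3):         # a few training steps with a fixed student
    teacher = ema_update(teacher, student)
```

After each step the teacher moves a small fraction of the way toward the student, so it acts as a slowly varying, stabilized target for the student to match.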

Self-Supervised Learning

Self-supervised learning is a type of machine learning in which the training signal is derived from the data itself: the model learns to predict withheld parts of the input from the remaining parts, so it can train on unlabeled data without manual annotation.
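A toy pretext task in this spirit: given unlabeled triples, hide the middle value and train a one-parameter model to predict it from its two neighbors. The "label" is part of the input itself, so no annotation is needed. The data and learning rate below are illustrative, not from the source.

```python
# Self-supervised pretext task: predict the masked middle element of a
# triple as w * mean(left, right). For arithmetic triples the optimal
# weight is w = 1, which plain gradient descent recovers.

unlabeled = [(1.0, 2.0, 3.0), (2.0, 3.0, 4.0), (0.0, 1.0, 2.0)]

w = 0.0                                    # single trainable parameter
for _ in range(200):                       # epochs over the unlabeled data
    for left, mid, right in unlabeled:
        m = (left + right) / 2
        pred = w * m                       # prediction for the masked value
        w -= 0.1 * 2 * (pred - mid) * m    # gradient of squared error
```

The model never sees a human-provided label; the supervision comes entirely from masking and reconstructing the input, which is the core mechanism behind large-scale self-supervised pretraining.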