
metric_learning-1

Self-training with Noisy Student improves ImageNet classification

The idea is to iteratively train a noised student model that is no smaller than its teacher, then use the student as the next teacher and repeat.

While not converged:
1. Train a teacher network and use it to generate pseudo-labels on unlabeled data (no noise applied).
2. Train an equal-or-larger student model on ground-truth plus pseudo-labels, with noise (data augmentation, dropout, stochastic depth); see the sketch below.
3. Use the student as the new teacher and repeat.
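
A minimal sketch of one such round, assuming PyTorch-style `teacher`/`student` models, data loaders, and an optimizer; all names here are illustrative, not from the paper:

```python
import torch
import torch.nn.functional as F

def noisy_student_round(teacher, student, labeled_loader, unlabeled_loader, optimizer):
    """One Noisy Student round: pseudo-label with a clean (un-noised) teacher,
    then train a noised student on labeled + pseudo-labeled data."""
    teacher.eval()
    student.train()
    for (x_l, y_l), x_u in zip(labeled_loader, unlabeled_loader):
        with torch.no_grad():                    # teacher infers without noise
            pseudo = teacher(x_u).argmax(dim=1)  # hard pseudo-labels
        # student inputs are assumed to be noised (augmentation inside the loaders)
        loss = F.cross_entropy(student(x_l), y_l) + F.cross_entropy(student(x_u), pseudo)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return student  # becomes the next round's teacher
```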

Embedding Expansion: Augmentation in Embedding Space for Deep Metric Learning

Idea: Proposes augmentation in the embedding space and combines it with pair-based metric-learning losses.
Motivation: To generate hard synthetic pairs from easy samples without using a GAN.
Related Work: query expansion and database augmentation.
Method: interpolate between embeddings of the same class to generate synthetic points, then mine the hardest synthetic negative pairs for the loss (sketch below).
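
A minimal sketch of the interpolation step, assuming the input embeddings are already L2-normalized; `n_points` and the hardest-negative mining that follows are simplified:

```python
import torch
import torch.nn.functional as F

def embedding_expansion(anchor, positive, n_points=2):
    """Generate synthetic embeddings as internal division points between two
    same-class embeddings, then project them back to the unit hypersphere."""
    alphas = torch.linspace(0, 1, n_points + 2)[1:-1].view(-1, 1)  # exclude endpoints
    synthetic = alphas * anchor + (1 - alphas) * positive
    return F.normalize(synthetic, dim=-1)
```

The hardest of these synthetic points with respect to an anchor from another class is then used as the mined negative in an existing pair-based loss.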

Momentum Contrast for Unsupervised Visual Representation Learning

Motivation: Close the gap between unsupervised learning and supervised learning.
Idea: Reformulate contrastive matching as a dictionary look-up.
Method

  1. Loss function (InfoNCE): maximize the softmax probability of the positive key against queued negatives.
  2. Implement the dictionary as a queue of encoded keys and maintain a momentum (EMA) update on the key encoder (sketch below).
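
A minimal sketch of both mechanisms, assuming L2-normalized embeddings and a fixed-size `queue` tensor of negative keys of shape (K, dim); names and the temperature value are illustrative:

```python
import torch
import torch.nn.functional as F

def momentum_update(query_encoder, key_encoder, m=0.999):
    """EMA update: the key encoder slowly tracks the query encoder."""
    for q_p, k_p in zip(query_encoder.parameters(), key_encoder.parameters()):
        k_p.data.mul_(m).add_(q_p.data, alpha=1 - m)

def info_nce(q, k_pos, queue, temperature=0.07):
    """InfoNCE: softmax over one positive key and K queued negative keys."""
    l_pos = (q * k_pos).sum(dim=1, keepdim=True)  # (N, 1) positive logits
    l_neg = q @ queue.T                           # (N, K) negative logits
    logits = torch.cat([l_pos, l_neg], dim=1) / temperature
    labels = torch.zeros(q.size(0), dtype=torch.long, device=q.device)  # positive at index 0
    return F.cross_entropy(logits, labels)
```

After each step, the newest keys are enqueued and the oldest mini-batch is dequeued, which keeps the dictionary large while its keys stay consistent.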

A Simple Framework for Contrastive Learning of Visual Representations

Method

  1. Composition of data augmentations plays a crucial role.
  2. A learnable nonlinear projection head between the representation and the contrastive loss is crucial (NT-Xent sketch below).
  3. Contrastive learning benefits from larger batch sizes and more training steps than supervised learning.
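
A minimal NT-Xent sketch, assuming `z1` and `z2` are the projection-head outputs for two augmented views of the same N images; the temperature value is illustrative:

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent: for each of the 2N views, the other view of the same image
    is the positive; the remaining 2N - 2 views act as negatives."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, d)
    sim = z @ z.T / temperature                         # pairwise cosine similarities
    n = z1.size(0)
    sim.masked_fill_(torch.eye(2 * n, dtype=torch.bool, device=z.device), float('-inf'))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)
```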

Learning Diverse Fashion Collocation by Neural Graph Filtering

Motivation: To satisfy the compatibility, diversity, and flexibility requirements of fashion collocation.

Highlights:
1. Edge-centric graph operations with a permutation-invariant symmetric aggregation function.
2. Focal loss to handle class imbalance (sketch below).
3. A new dataset for style classification.
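
A minimal binary focal-loss sketch (Lin et al., 2017); `gamma` and `alpha` here are the commonly used defaults, not necessarily this paper's settings:

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0, alpha=0.25):
    """Focal loss: down-weight easy examples so training focuses on hard,
    under-represented ones (targets are 0/1 floats)."""
    bce = F.binary_cross_entropy_with_logits(logits, targets, reduction='none')
    p_t = torch.exp(-bce)                                   # prob. of the true class
    alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
    return (alpha_t * (1 - p_t) ** gamma * bce).mean()
```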