Yann LeCun’s paper gets rejected from NeurIPS 2021


As per Yann LeCun’s posts on Facebook, LinkedIn and Twitter, NeurIPS 2021 has rejected the paper ‘VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning’ by him and his co-authors Adrien Bardes and Jean Ponce.

On May 12, 2021, Yann went on Twitter and posted a thread about the paper, showing how the three authors developed a simple and effective method for self-supervised training of joint-embedding architectures:

The news about the rejection by NeurIPS 2021 came today.

What is the paper on VICReg about?

The paper, titled ‘VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning’, is currently available on arXiv. It proposes a regularization method that explicitly avoids the collapse problem in self-supervised learning through a simple regularization term on the variance of the embeddings along each dimension individually.

The architecture of VICReg follows recent trends in self-supervised learning and is based on joint embedding learning with siamese networks. As LeCun summarizes it, VICReg is a non-contrastive loss function for joint-embedding architectures with three terms (a rough code sketch is given further below):

  1. Variance: a hinge loss on the standard deviation of each component of Gx(x) and Gy(y), which keeps the variance of each component above a given margin within a batch. This is the main innovation in VICReg.
  2. Invariance: Euclidean distance between embedding vectors Gx(x) and Gy(y).
  3. Covariance: sum of the squares of the off-diagonal terms of the covariance matrices of Gx(x) and Gy(y) over a batch. This is inspired by Barlow Twins. It pulls up the information content of the embedding vectors by decorrelating their components.
Figure: the VICReg architecture (Variance-Invariance-Covariance Regularization for Self-Supervised Learning).
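To make the three terms concrete, here is a minimal sketch of a VICReg-style loss in PyTorch. It is an illustration written from the description above, not the authors’ released code; the function name `vicreg_loss`, the margin `gamma`, and the weighting coefficients `lambda_`, `mu` and `nu` are placeholder assumptions.

```python
# Minimal sketch of a VICReg-style loss (illustration only, not the official code).
# z_a and z_b are batches of embeddings Gx(x) and Gy(y) of shape (batch_size, dim).
import torch
import torch.nn.functional as F


def vicreg_loss(z_a, z_b, gamma=1.0, lambda_=25.0, mu=25.0, nu=1.0, eps=1e-4):
    batch_size, dim = z_a.shape

    # Invariance: mean squared Euclidean distance between the two embeddings.
    invariance = F.mse_loss(z_a, z_b)

    # Variance: hinge loss keeping the standard deviation of every embedding
    # dimension above the margin gamma within the batch.
    std_a = torch.sqrt(z_a.var(dim=0) + eps)
    std_b = torch.sqrt(z_b.var(dim=0) + eps)
    variance = torch.mean(F.relu(gamma - std_a)) + torch.mean(F.relu(gamma - std_b))

    # Covariance: sum of squared off-diagonal entries of each covariance matrix,
    # which decorrelates the embedding components (Barlow Twins-style).
    z_a_c = z_a - z_a.mean(dim=0)
    z_b_c = z_b - z_b.mean(dim=0)
    cov_a = (z_a_c.T @ z_a_c) / (batch_size - 1)
    cov_b = (z_b_c.T @ z_b_c) / (batch_size - 1)

    def off_diagonal_sq_sum(m):
        return m.pow(2).sum() - m.pow(2).diagonal().sum()

    covariance = (off_diagonal_sq_sum(cov_a) + off_diagonal_sq_sum(cov_b)) / dim

    # Weighted sum of the three terms (coefficient values are placeholders).
    return lambda_ * invariance + mu * variance + nu * covariance
```

In a joint-embedding setup, `z_a` and `z_b` would be the embeddings produced by the two branches of the siamese network for two views of the same batch, and the combined loss would be backpropagated through both branches.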

The proposed method was evaluated on the ImageNet dataset and showed competitive results.

Figure: VICReg evaluation on ImageNet.

If you want to read the full paper, it is available on arXiv.

Why did NeurIPS 2021 reject the VICReg paper?

Although Yann LeCun hasn’t disclosed the reason for the rejection on his social media, he goes on to say the following:

“Since we posted the VICReg paper, numerous groups have picked up on the idea, and 12 of them have posted papers that cite it. If we had decided to not post the paper on ArXiv, these works would not have existed.

In fact, if we really wanted to be strict about anonymity, we could not even post the paper now because we are likely to resubmit it to another double-blind conference.

About 15 years ago, before ArXiv posting became commonplace, I found myself in situations where the first paper in a series was rejected multiple times (and not posted on arXiv), and a follow-up paper was accepted. But we could not refer to the original work!

The speed at which our field is progressing is too fast for the traditional reviewing system and conference cycles. We can either choose to slow down progress and respect obsolete rules, or realize that the rules are only there to make it easier to count points (not to make research better) and refuse to slow down progress by obeying them.”

The comment reflects how he would like the review system to speed up its cycles in the future so that researchers do not have to wait around for their papers to be published.

In Conclusion

What do you think about this rejection by NeurIPS 2021? Let us know in the comments and we will get back to you.


