Author: 김승일 (Director, 모두의연구소)

Presented: October 12, 2020, at the DeepLAB paper reading group

Learning Materials

Paper link

Recommended companion references

Attention Is All You Need

The Illustrated Transformer

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)

Paper Summary