sbmaruf/ReadingMaterial-sbmaruf
Reading lists.

Blogs

  1. A Recipe for Training Neural Networks, Andrej Karpathy.
  2. The Unreasonable Effectiveness of Recurrent Neural Networks, Andrej Karpathy.
  3. Recurrent Neural Networks. Link1, Link2, Link3, Link4
  4. Chris Olah's blog. Link

(TODO: many more to list.)

Contextual Representation

  1. Pennington et al, GloVe: Global Vectors for Word Representation. Link
  2. Mikolov et al, Distributed Representations of Words and Phrases and their Compositionality. Link
  3. Lample et al, Word Translation Without Parallel Data. Link
  4. Peters et al, ELMo: Deep Contextualized Word Representations. Link
  5. Radford et al, Improving Language Understanding by Generative Pre-Training. Link
  6. Devlin et al, BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Link
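For quick reference, the GloVe entry above fits word vectors to corpus co-occurrence statistics; its weighted least-squares objective (Pennington et al) is

```latex
J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^{\top} \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2
```

where \(X_{ij}\) is the co-occurrence count of words \(i\) and \(j\), \(w_i\) and \(\tilde{w}_j\) are word and context vectors with biases \(b_i, \tilde{b}_j\), and \(f\) is a weighting function that caps the influence of very frequent pairs.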

Cross-lingual / Multitask

  1. Lample et al, Word Translation Without Parallel Data. Link
  2. Xie et al, Neural Cross-Lingual Named Entity Recognition with Minimal Resources. Link

Domain Adaptation

  1. Ganin et al, Domain-Adversarial Training of Neural Networks. Link Slide1 Slide2
  2. Tzeng et al, Adversarial Discriminative Domain Adaptation. Link Slide
  3. Volpi et al, Adversarial Feature Augmentation for Unsupervised Domain Adaptation. Link Slide

GANs

  1. Goodfellow et al, Generative Adversarial Networks. Link Slide
  2. Arjovsky et al, Wasserstein GAN. Link
  3. Arjovsky et al, Towards Principled Methods for Training Generative Adversarial Networks. Link
  4. Salimans et al, Improved Techniques for Training GANs. Link
  5. Miyato et al, Spectral Normalization for Generative Adversarial Networks. Link
  6. Zhu et al, (Cycle-GAN) Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks. Link
  7. Zhao et al, Learning Sleep Stages from Radio Signals: A Conditional Adversarial Architecture. Link Slide
  8. Zhang et al, Aspect-augmented Adversarial Networks for Domain Adaptation. Link Slide
  9. Chen et al, Adversarial Deep Averaging Networks for Cross-Lingual Sentiment Classification. Link
  10. Lample et al, Word Translation Without Parallel Data. Link
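As a reference point for the first entries in this list, the original minimax objective (Goodfellow et al) is

```latex
\min_G \max_D \; \mathbb{E}_{x \sim p_{\text{data}}}[\log D(x)] + \mathbb{E}_{z \sim p_z}[\log(1 - D(G(z)))]
```

while the Wasserstein GAN (Arjovsky et al) replaces it with

```latex
\min_G \max_{D \in \text{1-Lipschitz}} \; \mathbb{E}_{x \sim p_{\text{data}}}[D(x)] - \mathbb{E}_{z \sim p_z}[D(G(z))]
```

which trades the Jensen-Shannon-based saturation problems of the original loss for a constrained critic.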

Misc

  1. Collobert et al, Natural Language Processing (Almost) from Scratch. Link
  2. Vinyals et al, Pointer Net. Link

NER

  1. Lample et al, Neural Architectures for Named Entity Recognition. Link
  2. Strubell et al, Fast and Accurate Entity Recognition with Iterated Dilated Convolutions. Link Slide
  3. Yasunaga et al, Robust Multilingual Part-of-Speech Tagging via Adversarial Training. Link
  4. Xie et al, Neural Cross-Lingual Named Entity Recognition with Minimal Resources. Link

NMT

  1. Cho et al, Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation. Link Slide
  2. Sutskever et al, Sequence to Sequence Learning with Neural Networks. Link Slide
  3. Bahdanau et al, Neural Machine Translation by Jointly Learning to Align and Translate. Link Slide
  4. Luong et al, Effective Approaches to Attention-based Neural Machine Translation. Link Slide
  5. Vaswani et al, Attention Is All You Need. Link
  6. Wu et al, Pay Less Attention with Lightweight and Dynamic Convolutions. Link
  7. Sennrich et al, Neural Machine Translation of Rare Words with Subword Units. Link
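The Transformer entry above replaces recurrence with attention. As a minimal illustrative sketch (NumPy, not the authors' code), scaled dot-product attention is just a softmax over query-key similarities applied to the values:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # softmax(Q K^T / sqrt(d_k)) V, as in Vaswani et al. (2017)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # (n_q, n_k) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # row-wise softmax
    return weights @ V                                   # (n_q, d_v)

# toy example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)

# sanity check: zero queries give uniform weights, so the output is the mean of V's rows
assert np.allclose(scaled_dot_product_attention(np.zeros((1, 4)), K, V),
                   V.mean(axis=0))
```

The real model adds masking, multiple heads, and learned projections; this shows only the core operation.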

RNN

  1. Greff et al, LSTM: A Search Space Odyssey. Link
  2. Shen et al, Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks. Link

Tree structure

  1. Shen et al, Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks. Link

Murphy's book

  1. Generative Models for Discrete Data (Kevin Murphy, Machine Learning: A Probabilistic Perspective, Chapter 3). Link

About

These are some papers and quality blogs that I read.
