2018 Stanford CS224n NLP course projects

Prize Winners

Prizes Round 1 (based on the poster session)

Custom Projects

  1. Attention, I’m Trying to Speak: Speech Synthesis by Akash Mahajan
  2. Attention On Attention: Architectures for Visual Question Answering (VQA) by Jasdeep Singh and Vincent Ying
  3. Word2Bits – Quantized Word Vectors by Maximilian Lam

Default Projects

  1. Question Answering On SQuAD Dataset by Junjie Dong, Zihuan Diao, and Jiaxing Geng
  2. Adversarial SQuAD by Amita Kamath and Akhila Yerukola
  3. Machine Comprehension Using Bidirectional Attention by Amirhossein Kiani and Behrooz Ghorbani

Audience Selection Prize

  1. Exploring and Mitigating Gender Bias in GloVe Word Embeddings by Francesca Vera

Prizes Round 2 (based on the reports)

Custom Projects

  1. Yup’ik Eskimo and Machine Translation of Low-Resource Polysynthetic Languages by Christopher Liu and Laura Domine
  2. Pragmatic Training for Reference Games by Bill McDowell
  3. Representing Words with Only Subword Information by Alexia Wenxin Xu

Default Projects

  1. Iterative reasoning with bi-directional attention flow for machine comprehension by Anand Dhoot and Anchit Gupta
  2. Diverse Ensembling for Question Answering by Ben Cohen-Wang and Edward Lee
  3. Question Answering On SQuAD by Cagan Alkan and Beliz Gunel

Custom Projects

Visual Question Answering by Stefanie Anna Baby, Ashwini Pokle
Pointer-Generator Network Summarization On TextRank-Preprocessed Documents by Long Viet Tran
Automatic Lyrics-Based Music Genre Classification by Zhao Kezhen, Ruoxi Zhang, Peiling Lu
Exploring neural architectures for NER by Vincent Billaut, marcthib@stanford.edu
Neural Abstractive Summarization On Gigaword by Chenduo Huang, Matthew Donghyun Kim
Deep Learning Approaches to Classifying Types of Toxicity in Wikipedia Comments by Howard Small, Ashe Marie Magalhaes
Exploring Deep Learning in Combating Internet Toxicity by Sushan Bhattarai
Playlist Title Generation Using Sequence to Sequence by Sofia Samaniego De La Fuente
Detecting Depression Through Tweets by Aileen Wang, Diveesh Singh
Predicting Myers-Briggs Type Indicator With Text by Ian Knight
Dank Learning: Generating Memes Using Deep Neural Networks by Lawrence Peirson, Emine Meltem Tolunay
Language Modeling with Generative Adversarial Networks by Mehrad Moradshahi, Utkarsh Contractor
Toxic Comment Categorization using Bidirectional LSTMs with Attention by Anthony Ho, Michael Anthony Baumer
Movie Recommendation System Enhanced by Natural Language Processing by Jiaxi Chen, Ziran Zhang
Predicting Company Ratings through Glassdoor Reviews by Fabian Frederik Frank, Tyler Whittle
State-of-the-art abstractive summarization by Stelios Serghiou, Apurva Pancholi, Peter Li
Learning a New(s) Model: An exploration of LSTM classification and Language Modeling of News Articles by Luladay Price, Stephone Christian
When GloVe meets GAN: Adversarial Language Generation Using Dense Vector Embeddings by Junkyo Suh, Edwin Yuan, Manish Pandey
Paying attention to toxic comments online by Emily Ellis Kuehler, Manav Kohli, John Palowitch
Cache Attention for Recurrent Language Modeling by Colin Gaffney
Training Dialog Agents to Negotiate by Kerem Goksel
Predicting the Side Effects of Drugs by Camilo Ruiz
Never Stop Learning by Lawrence Stratton
Stacked Attention for Visual Question Answering by Bingbin Liu, Weini Yu
Are you sure of your answer? Think again. by Lakshmi Manoharan, Arjun Parthipan
End-to-End Task-Oriented Dialogue Agents by Derek Chen
Neural Models for Email Response Prediction by Tucker James Leavitt
CS224N: Detecting and Classifying Toxic Comments by Kevin Dara Khieu, Neha Narwal
Conditional MaskGAN and evaluation via classification by Hanoz Bhathena, Renke Cai
Shakespeare and Satoshi – De-anonymizing Writing Using BiLSTMs with Attention by Varun Ramesh, Jean-Luc Watson
Deep Neural Networks for Added Emphasis by Jon Kotker, Niraj Jayant
Def2Vec: Learning Word Vectors from Definitions by Tony Duan, Andrey Kurenkov
Classy Classification: Classifying and Generating Expert Wine Review by Frederick William Lawrence Robson, Loren Karl Amdahl-Culleton
Colors in Context: An Implementation by Alec Joseph Brickner
Using General Adversarial Networks for Marketing: A Case Study of Airbnb by John Kamalu, Richard Diehl Martinez
IMAGAN: Learning Images from Captions by Samir Sen, Trevor Tsue, Karan Singhal
ERASeD: Exposing Racism And Sexism using Deep Learning by Jayadev Bhaskaran, Suvadip Paul
Generalizing word vectors in a multi model approach by Kareem Hegazy
Predicting Gender of Poets with Deep Learning Methods by Samuel Mignot
Toxic Comment detection with bi-directional LSTM by Xiaoyan Wu
Predicting and Generating Discussion Inspiring Comments by Yunhe Wang, Junwon Park
Solving Math Word Problems by Ryan Anthony Wong
Improving the Neural Dependency Parser by Chuanbo Pan, Jeffrey Barratt, Shane Barratt
Natural Language Guided Reinforcement Learning for Playing Snake in Arbitrary Dimensions by Alexander Seutin
Predicting Funny Yelp Reviews by Christina Ashley Pan, John Martin Poothokaran
Generating SQL queries from natural language by Ikshu Bhalla, Archit Gupta
Context is Everything: Finding Meaning Statistically in Semantic Spaces by Eric Zelikman
Unsupervised Domain Adaptation for Sentiment Classification using Pseudo-Labels by Ruishan Liu, Liyue Shen
Predicting Entailment through Neural Attention and Binary Parsing by Natalie Ng, Matthew Jay Katzman, Christina Montefalcon Ramsey
Dialogue Generation using Reinforcement Learning and Neural Language Models by Carson K Lam, Marcella Cindy Prasetio

Default Projects

Step by Step approach to build a model for SQuAD by Anjan Dwaraknath
Answering Questions with CharCNN and Bi-directional Attention Flow by Connie Xiao, Cindy Ding Jiang
Question Answering System with Deep Learning by Jake Spracher, Takahiro Fushimi, Robert M Schoenhals
Character CNN and Self-Attention for SQuAD by Jianqing Yang
Recurrent Neural Networks with Attention for Question Answering by Ben Hannel
BiDAF-inspired Preferential Multi-perspective Matching for Question Answering Task by Yuxing Chen, Kexin Yu
Reading Comprehension with SQuAD Dataset by Wei Kang
Improved Question Answering On the SQuAD Dataset Using Attention Mechanisms by Kelly Shen
Attention Mechanism in Machine Comprehension by Yingnan Xiao
Conditioning LSTM Decoder and Bi-directional Attention Based Question Answering System by Heguang Liu
Attention-Based Neural Network For Question Answering by Zhengyang Tang, Songze Li
Evaluating Different Techniques On SQuAD by Xinyu Xu, Zhangyuan Wang
R-NET with BiDAF for Reading Comprehension by Jingwei Ji, Zibo Gong
Bidirectional attention flow for Question Answering by Ana-Maria Istrate
The Quest for High-Performance Question Answering Neural Net Models by Lauren Blake
A Comparison of RNN and Transformer-based Question Answering Systems by Adam Jensen, Diana Moncoqut
Ensemble Learning for Stanford Question Answering Challenge by Yuzhou Liu
R-NET based Neural Network for Machine Reading Comprehension by Abhishek Bharani, VenkataBalaji Kollu
Bi-Directional Attention & Self Attention for SQuAD by Li Deng, Zhiling Huang
Machine Reading Comprehension On the SQuAD Dataset Using R-NET by Jason Mian Luo, William Zeng
Improving SQuAD Baseline Using BiDAF Refinements and Experimenting with Semi-Supervised Learning by Allison Koenecke, Varun Vasudevan
SQuAD GOALS Guided Objective Advanced Learning System by Derek Phillips
Question Answering using BiDAF and DrQA by Fu Rui, Xuan Yang
Question Answering with Attentions Ensemble by Zihan Lin, Jason Zhu, Teng Zhang
Q&A on the SQuAD dataset by Matej Kosec, Liz Wen Yao
A Study of Attention in Deep Learning Models for Question Answering by William Locke
Question Answering On the SQuAD Dataset by Stephanie Vincci Tang, Ivan Suarez Robles
High Performance SQuAD and Transfer Learning by Alexandre Gauthier, Jeff Chen
Question Answering on SQuAD Dataset with BiDAF and Self-Attention by Junwei Yang
Improved Question Answering on the SQuAD Dataset Using Attention Mechanisms by Vincent Sheu
Reading Comprehension Neural Network with Attention and Post Attention Modeling by Xu Zhao, Zhi Liu
CS224N Default Final Project Write-Up by Mark Holmstrom
Paying Attention to SQuAD: Exploring Bidirectional Attention Flow by Lucy Li, Heather Rae Blundell
Machine Comprehension on SQuAD BiDAF vs Coattention by Minh-An Quinn
Question Answering with Attention by Stephanie Dong, Ziyi Li
An Exploration of Question-Answering Modules by Margaret Gan Guo, Wen Torng
An Ensemble Model for SQuAD by Yuze He, Priyanka Dwivedi
The SQuAD Challenge – Machine Comprehension on the Stanford Question Answering Dataset by Rohit Prakash Apte
Question Answering using Bidirectional Attention Flow and Co-Attention by Apoorva Dornadula, Parth Shah
Question Answering by Omar Sow
Exploring Techniques for Neural Question Answering by Gabriel Bianconi, Mahesh Agrawal
Question answering using weighted-loss Bi-Directional Attention Flow on SQuAD Dataset by mengjiec@stanford.edu
An Exploration of State of the Art Techniques for Question Answering Systems by James Payette
Reading Comprehension using Bi-Directional Attention Network by Pratik Kumar, Neel Mani Singh
Neural Question Answering by Aneesh Pappu, Rohun Saxena
Combining Bidirectional Attention Flow and Attention Pooling Pointer Networks for High Performance on the SQuAD Challenge by Kiko Ilagan, Anoop Manjunath
Reading Comprehension and Question Answering with Bidirectional Attention Flow by Andrew Huang, Michael Ko
Investigations in Question Answering Architectures by Patrick Cho, Sudarshan Seshadri
The Impact of Attention Mechanisms on Question Answering Performance by Joe M Paggi, Benjamin Parks
Machine Comprehension on SQuAD BiDAF vs Coattention by Ramin Ahmari
Towards an Integrated QA Model by Fangzhou Liu
Implementation of R-NET Machine Comprehension Model for Question Answering by Sabarish Sankaranarayanan
Question Answering on the SQuAD Dataset by Laëtitia Shao, Ben Zhou
Applying Bi-Directional Attention Flow to SQuAD by Jestin Ma, Jialin Ding
A Bidirectional Attention-Based Approach to Machine Comprehension and Question Answering by Kevin Chen
Question Answering System with Question Type Modelling by Ksenia Ponomareva
Machine Comprehension on SQuAD using Bi-Directional Attention Flow by Daisy Ding
CS224N: Question-Answering Utilizing Bidirectional Attention Flow by Wesley Chan Olmsted, Trevor Danielson
R-Net with Multiplicative Attention by Rooz Mahdavian, Pierce Barrett Freeman
Question Answering with Coattention Encoding and Answer Pointer Network by Yinghao Xu
CS224N Final Report by Allen Zhao, Dirk Hofland
Computational Reading Comprehension through Self-Attention and Convolution by Neil Movva, Samir Menon
Exploring Attention Mechanisms for Reading Comprehension by Noah Makow
Machine Comprehension Systems on SQuAD Dataset by Megha Jhunjhunwala, Shantanu Thakoor
Question Answering Using Bi-Directional Attention Flow with Position Encoder by Denis
CS224N Final SQuAD Improvements by Ryan Almodovar, Vivek Misra
SQuAD Challenge by Matthew Creme, Mackenzie James Pearson, Raphael Lenain
SQuAD With LSTM and BiDAF by Ethson Villegas, Danielle Siy
Combining Attention Approaches for the SQuAD Challenge by Luke James Blackshaw Asperger
Multi-layer GRU using character level information for SQuAD challenge by Jake Yoon
Machine Comprehension with BiDAF by Shim Young Lee
Lightweight Convolutional Approaches to Reading Comprehension for SQuAD by Ben Penchas, Tobin Bell
Question Answering System with Bidirectional Attention Flow by Hsu-kuang Chiu, Ting-Wei Su
Bidirectional Attention Flow with LSTM Networks for Question Answering by James Li
Bi-Directional Attention Flow and Co-Attention Models for Question Answering on the SQuAD by Rafael Musa
Analyzing Modeling Layers for the SQuAD Challenge by Tim Anderson
CS224N SQuAD Challenge with Bidirectional Attention Flow and Context Features by David Lee-Heidenreich, Adrien Truong
A Bi-directional Attention Flow Model for the SQuAD Dataset by Cody Keola Kala, Horace Chu
Machine Comprehension with BiDAF and Answer Pointer by Zehui Wang, Xiaoxue Zang
Question Answering with Deep Learning by Jackie Yau, Hao Wu
Reading Comprehension on the SQuAD Dataset by David Xue, Bill Zhu (Legal Name: Zheqing Zhu)
Machine Comprehension on SQuAD using Bi-Directional Attention Flow by Ruohan Zhan
An Approach to Machine Reading Comprehension on SQuAD by Jiafu Wu, Alan Flores-Lopez
Question Answering With Deep Bidirectional Attention Flow and FusionNet by Silviana Ciurea-Ilcus, Michal Kim Wegrzynski
SQuAD Challenge: A Hybrid Model for Question Answering by Onur Cezmi Mutlu, Hacer Umay Geyikci
Natural language Question Answering using Curriculum Learning by Abhijeet Shenoi, Aarti Bagul
Exploring Attention in Question Answering Models by Anav Sood, Ethan Zi-Yu Shen
A Multi-Attention Reading Comprehension Model for SQuAD Dataset by Shuyang Shi, Tong Yang
Deep Question Answering on SQuAD by Mitchell Dawson
CS224N Final Project SQuAD Challenge by Saahil Agrawal, Nicholas Johnson
SQuAD Challenge using BiLSTM and Bidirectional Attention Flow by Mojtaba Sharifzadeh
SQuAD Model Exploration: BiDAF and Input Feature by Ben Barnett, Jeffrey Dong Chen
Question Answering with the SQuAD by Wayne Lu
Question Answering on the SQuAD Dataset by Yongshang Wu, Hao Wang
Building a Question Answering System with a Character Level Convolutional Neural Network and Attention Layers – Is it a good idea? by Praveen Govindraj
Pay More Attention: Neural Architectures for Question-Answering by Zia Hasan, Sebastian Fiscer
Question Answering model using BiDAF by Shawn Hu, Ran Gross
Reading Comprehension with Neural Networks by Andrew Weitz
Question Answering on the SQuAD Dataset by Hyun Sik Kim
Question answering on the SQuAD dataset with bidirectional attention flow by Brahm Capoor, Varun Nambikrishnan
Question Answering with Bi-directional Attention and Character Embedding by Yuting Sun, Xiangcao Liu
Ask BiDAF by Mitchell Douglass, Caelin Tran, Griffin Slade Koontz
A Bi-directional Attention Flow (BiDAF) Model for the Stanford Question Answering Dataset (SQuAD) by Charles Hale, Helen Xiong
Question Answering on SQuAD by Jake Smola, Evan
Coattention-Based Neural Network for SQuAD Question Answering by Xizhi Han, Yue Hui
Bi-Directional Attention and Beyond: Double BiDAF with Residual Connections for Question Answering by Pedro Montebello Milani
Machine Reading Comprehension on SQuAD with Relevance Encoder by Feng Liu, Qixiang Zhang
Question Answering by Bowen Deng
Question Answering with Various Attention Mechanisms by Yinghao Sun
Exploring speed and memory trade-offs for achieving optimum performance on SQuAD dataset by Renat
Question Answering System with the Dynamic Coattention Network by Yi Sun
A Modular Architecture for Machine Comprehension by William Arthur Clary
Reading Comprehension with the SQuAD by Hugo Andres Valdivia, Miguel Garcia
Machine Comprehension with Bi-directional and Self-attention Flow by Ji Yu, Tianpei Qian
CS224N Project Report: Bidirectional Attention Flow and Self Attention Mechanisms for Machine Comprehension by Jervis Jerome Muindi, Richard Ruiqi Yang
SQuAD Reading Comprehension Task – CS224n Final Project by Adva Wolf
Machine Reading Comprehension On SQuAD by Tian Tan, Don Mai
A Hybrid Deep Learning System for Machine Comprehension by Gang Wu
A Deep Learning System for the Stanford Question Answering Dataset (SQuAD) by AmirMahdi Ahmadinejad
Machine Learning Optimization for SQuAD by Julian Sinohe Villalpando
Final Project Report: SQuAD by Winston Taojie Wang, Michael Chung
Question Answering System on the Stanford Question Answering Dataset (SQuAD) by Richard Akira Heru
Final default project: question answering with deep learning by Denis Ulanov, David Uvalle
Exploring Deep Learning Solutions for Question-Answering and Reading Comprehension Tasks by Rodrigo Grabowsky, Kimberly Wijaya
BiDirectional Attention for Machine Comprehension by Anand Venkatesan, Ananthakrishnan Ganesan
Replicating Advances in Question-Answering with Deep Learning and Complex Attention by Stuart Cornuelle
Reading Comprehension On SQuAD: An Insight into BiDAF by Vivekkumar Patel, Shreyash Pandey
CS 224N Default Final Project: Question Answering by Raghunath Krishnamurthy
The SQuAD Challenge – results of several by Rajeeva Gaur, Satyam
Question Answering with Hybrid Attention Network by Yicheng Li, Xiuye Gu
Machine Comprehension using Deep Learning by Sharman W Tan, Henry Lin
Machine Reading Comprehension for the SQuAD Dataset using Deep Learning by Chung Fat Wong
Question Answering with Bi-directional Attention Flow and Self-Attention by Olivier Pham, Yuguan Xing
Question Answering On the SQuAD Dataset by Jonas Shomorony
Default Final Project for CS224N – Self-Attention by Jaak Joonas Uudmae
GE-BiDAF: A Question Answering model for SQuAD by Binbin Xiong, Minfa Wang
SQuAD Reading Comprehension by Xinyi Jiang
Machine Comprehension using BiDAF by Bimal Parakkal