CS224d: Deep Learning for Natural Language Processing (Spring 2016)
Richard Socher and James Hong and Sameep Bagadia and David Dindi and B. Ramsundar and N. Arivazhagan and Qiaojing Yan

cs224d (9 files)
CS224D Lecture 1 - 29th Mar 2016.mp4 (364.74 MB)
CS224D Lecture 2 - 31st Mar 2016.mp4 (450.86 MB)
CS224D Lecture 3 - 5th Apr 2016.mp4 (388.37 MB)
CS224D Lecture 4 - 7th Apr 2016.mp4 (423.65 MB)
CS224D Lecture 5 - 12th Apr 2016.mp4 (444.40 MB)
CS224D Lecture 6 - 14th Apr 2016.mp4 (476.64 MB)
CS224D Lecture 7 - Introduction to TensorFlow (19th Apr 2016).mp4 (383.57 MB)
CS224D Lecture 8 - 21st Apr 2016.mp4 (443.25 MB)
CS224D Lecture 10 - 28th Apr 2016.mp4 (453.86 MB)
Type: Course
Tags: nlp, deep learning, cs224d

Bibtex:
@misc{cs224d_spring2016,
title= {CS224d: Deep Learning for Natural Language Processing (Spring 2016)},
keywords= {nlp, deep learning, cs224d},
author= {Richard Socher and James Hong and Sameep Bagadia and David Dindi and B. Ramsundar and N. Arivazhagan and Qiaojing Yan},
year= {2016},
abstract= {Natural language processing (NLP) is one of the most important technologies of the information age. Understanding complex language utterances is also a crucial part of artificial intelligence. Applications of NLP are everywhere because people communicate almost everything in language: web search, advertisement, email, customer service, language translation, radiology reports, and more. A large variety of underlying tasks and machine learning models power NLP applications. Recently, deep learning approaches have achieved very high performance across many different NLP tasks. These models can often be trained end to end and do not require traditional, task-specific feature engineering. In this spring quarter course, students will learn to implement, train, debug, visualize, and invent their own neural network models. The course provides a deep excursion into cutting-edge research in deep learning applied to NLP. The final project will involve training a complex recurrent neural network and applying it to a large-scale NLP problem. On the model side we will cover word vector representations, window-based neural networks, recurrent neural networks, long short-term memory models, recursive neural networks, and convolutional neural networks, as well as recent models involving a memory component. Through lectures and programming assignments, students will learn the necessary engineering tricks for making neural networks work on practical problems.}
}
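As a concrete illustration of the window-based neural networks mentioned in the abstract above, here is a minimal NumPy sketch: the word vectors in a fixed context window are concatenated, passed through one tanh hidden layer, and classified with a softmax. This is not course code; the vocabulary, dimensions, labels, and hyperparameters are all illustrative assumptions.

import numpy as np

# A toy vocabulary; all words, dimensions, and labels below are
# illustrative assumptions, not CS224d course material.
rng = np.random.default_rng(0)
vocab = ["<pad>", "museums", "in", "Paris", "are", "amazing"]
word2id = {w: i for i, w in enumerate(vocab)}

d = 8          # word-vector dimension (assumed)
window = 3     # context window size (assumed)
hidden = 16    # hidden-layer width (assumed)
classes = 2    # e.g. LOCATION vs. OTHER (assumed label set)

# Parameters: embedding matrix, hidden layer, output layer.
E = rng.normal(0.0, 0.1, (len(vocab), d))
W1 = rng.normal(0.0, 0.1, (window * d, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 0.1, (hidden, classes))
b2 = np.zeros(classes)

def forward(window_ids):
    # Concatenate the window's word vectors, apply a tanh hidden
    # layer, and return softmax class probabilities.
    x = E[window_ids].reshape(-1)      # (window * d,)
    h = np.tanh(x @ W1 + b1)           # (hidden,)
    logits = h @ W2 + b2               # (classes,)
    p = np.exp(logits - logits.max())  # numerically stable softmax
    return p / p.sum()

# Usage: classify the center word "Paris" from its context window.
ids = [word2id[w] for w in ["in", "Paris", "are"]]
print(forward(ids))  # two class probabilities summing to 1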

