References

  • Iyyer, M., Manjunatha, V., Boyd-Graber, J., & Daumé III, H. (2015). Deep Unordered Composition Rivals Syntactic Methods for Text Classification. Demonstrates that multilayer feed-forward networks can provide competitive results on sentiment classification and factoid question answering.
  • Karpathy, A., Johnson, J., & Fei-Fei, L. (2015). Visualizing and Understanding Recurrent Networks.
  • Kalchbrenner, N., Grefenstette, E., & Blunsom, P. (2014). A Convolutional Neural Network for Modelling Sentences.
  • Wang, X., Liu, Y., Sun, C., Wang, B., & Wang, X. (2015b). Predicting Polarities of Tweets by Composing Word Embeddings with Long Short-Term Memory.

Ideas

  1. Use a bidirectional RNN so that information from both directions is available? Bidirectional LSTM (Graves and Schmidhuber, 2005b). A Keras sketch combining this with ideas 2 and 4 follows this list.
  2. Use PReLU? http://www.tuicool.com/articles/NBBbamm
  3. Run comparison experiments swapping the activation function, and others swapping the LSTM structure.
  4. After the ReLU, add a batch normalization step so that the inputs of every layer keep approximately the same distribution (Ioffe, S., & Szegedy, C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift).
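A minimal Keras sketch of ideas 1, 2, and 4 above, assuming padded, integer-encoded sentences as input. The hyperparameters (vocab_size, embed_dim, max_len, num_classes, layer widths) are illustrative placeholders, not values from these notes:

```python
# Hedged sketch: bidirectional LSTM (idea 1) with a PReLU-activated
# dense layer (idea 2) followed by batch normalization (idea 4).
from keras.models import Sequential
from keras.layers import (Embedding, Bidirectional, LSTM, Dense,
                          Dropout, PReLU, BatchNormalization)

vocab_size, embed_dim, max_len, num_classes = 10000, 128, 50, 2  # placeholders

model = Sequential([
    Embedding(vocab_size, embed_dim, input_length=max_len),
    Bidirectional(LSTM(64)),   # idea 1: read the sequence in both directions
    Dense(64),
    PReLU(),                   # idea 2: PReLU instead of a plain ReLU
    BatchNormalization(),      # idea 4: keep layer inputs on a similar distribution
    Dropout(0.5),
    Dense(num_classes, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```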

Highlights

  • Using an RNN → sequence labeling; it also makes it possible to work with sequences of distributed representations, i.e., word vectors
  • Using ReLU and dropout
  • Comparison experiments
  • Using word embeddings (see the sketch after this list)
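A hedged sketch of the word-embedding highlight: initialising a Keras Embedding layer from pretrained vectors. `word_index` and `pretrained` are hypothetical stand-ins for a real vocabulary and a loaded word2vec/GloVe table, not names from these notes:

```python
# Hedged sketch: feed pretrained word vectors into a Keras Embedding layer.
import numpy as np
from keras.layers import Embedding

embed_dim = 100
word_index = {"good": 1, "bad": 2}            # hypothetical token -> id map
pretrained = {w: np.random.rand(embed_dim)    # fake vectors for illustration;
              for w in word_index}            # load real word2vec/GloVe instead

# Row i of the weight matrix holds the vector for word id i;
# row 0 stays all-zero and is reserved for padding.
weights = np.zeros((len(word_index) + 1, embed_dim))
for word, i in word_index.items():
    weights[i] = pretrained[word]

embedding = Embedding(input_dim=weights.shape[0],
                      output_dim=embed_dim,
                      weights=[weights],
                      trainable=False)        # freeze, or set True to fine-tune
```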

BiLSTM: https://github.com/farizrahman4u/seq2seq

Below are websites and introductory papers on LSTMs (RNNs) that I regularly come across and find very good:

  1. Understanding LSTM Networks: http://colah.github.io/posts/2015-08-Understanding-LSTMs/
  2. RNN tutorial series: http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/
  3. Introductory material on neural networks and NLP (attached). Besides that fairly introductory article, the same site is also good: parts 1, 2, 3, and 4 of http://www.wildml.com/2015/09/recurrent-neural-networks-tutorial-part-1-introduction-to-rnns/ are worth reading, and they come with Theano examples.

  • LSTM Networks for Sentiment Analysis: http://deeplearning.net/tutorial/lstm.html#lstm
  • UFLDL tutorial: http://ufldl.stanford.edu/wiki/index.php/%E7%A5%9E%E7%BB%8F%E7%BD%91%E7%BB%9C
  • Introduction to Recurrent Neural Networks (RNNs): http://blog.csdn.net/heyongluoyao8/article/details/48636251
  • Basic neural networks: http://www.wildml.com/2015/09/implementing-a-neural-network-from-scratch/
  • Feature engineering: http://machinelearningmastery.com/discover-feature-engineering-how-to-engineer-features-and-how-to-get-good-at-it/

Word vectors

Neural network properties

Activation functions

LSTM

keras

Resources

FNNs (Feed-forward Neural Networks)

  • The number of input-layer nodes equals the dimensionality of the input data
  • The number of output-layer nodes equals the number of classes
  • More hidden-layer nodes mean greater fitting capacity, but also a greater risk of overfitting
  • Each hidden-layer node applies an activation function (tanh, the sigmoid function, or ReLU; softmax is typical for the output layer). A convenient activation function has an easy-to-compute derivative, ideally one expressible through the function's own value, e.g. tanh'(x) = 1 - tanh^2(x)
  • Loss function (cross-entropy; see the NumPy sketch after this list): \begin{aligned} L(y,\hat{y}) = - \frac{1}{N} \sum_{n \in N} \sum_{i \in C} y_{n,i} \log\hat{y}_{n,i} \end{aligned}
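A minimal NumPy sketch of a one-hidden-layer FNN forward pass and the cross-entropy loss above. The shapes and the random data are illustrative assumptions, not from these notes:

```python
# Hedged sketch: one-hidden-layer feed-forward network in NumPy.
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def forward(X, W1, b1, W2, b2):
    h = np.tanh(X @ W1 + b1)       # hidden layer: tanh activation
    return softmax(h @ W2 + b2)    # output layer: one node per class

def cross_entropy(Y, Y_hat):
    # L(y, y_hat) = -(1/N) * sum_n sum_i y_{n,i} * log(y_hat_{n,i})
    return -np.sum(Y * np.log(Y_hat)) / Y.shape[0]

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))                    # 4 samples, 3 input dimensions
Y = np.eye(2)[[0, 1, 1, 0]]                    # one-hot labels, 2 classes
W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)  # 5 hidden nodes
W2, b2 = rng.normal(size=(5, 2)), np.zeros(2)  # 2 output nodes = 2 classes

print("loss:", cross_entropy(Y, forward(X, W1, b1, W2, b2)))
```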
