Conference on Empirical Methods in Natural Language Processing

Convolutions Are All You Need (For Classifying Character Sequences)

Abstract

While recurrent neural networks (RNNs) are widely used for text classification, they demonstrate poor performance and slow convergence when trained on long sequences. When text is modeled as characters instead of words, the longer sequences make RNNs a poor choice. Convolutional neural networks (CNNs), although somewhat less ubiquitous than RNNs, have an internal structure more appropriate for long-distance character dependencies. To better understand how CNNs and RNNs differ in handling long sequences, we use them for text classification tasks in several character-level social media datasets. The CNN models vastly outperform the RNN models in our experiments, suggesting that CNNs are superior to RNNs at learning to classify character-level data.
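The architecture the abstract alludes to is the standard character-level CNN for classification: embed each character, slide 1-D convolutional filters over the embedded sequence, then max-pool over time so arbitrarily long inputs yield a fixed-size feature vector. Below is a minimal NumPy sketch of that feature extractor; all names, dimensions, and the toy vocabulary are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def char_cnn_features(text, vocab, emb, filt, width=3):
    """Embed characters, apply one 1-D convolution, then max-pool over time.

    emb:  (vocab_size, emb_dim) character embedding table (illustrative)
    filt: (width, emb_dim, n_filters) convolution filters (illustrative)
    """
    ids = [vocab.get(c, 0) for c in text]            # char -> index, 0 = unknown
    x = emb[ids]                                     # (seq_len, emb_dim)
    n_out = len(ids) - width + 1
    conv = np.array([
        # contract the (width, emb_dim) window against each filter
        np.tensordot(x[i:i + width], filt, axes=([0, 1], [0, 1]))
        for i in range(n_out)
    ])                                               # (n_out, n_filters)
    return np.maximum(conv, 0).max(axis=0)           # ReLU + max-over-time pool

# Toy setup: lowercase letters plus space, random parameters.
rng = np.random.default_rng(0)
vocab = {c: i + 1 for i, c in enumerate("abcdefghijklmnopqrstuvwxyz ")}
emb = rng.normal(size=(len(vocab) + 1, 8))           # (vocab_size, emb_dim)
filt = rng.normal(size=(3, 8, 4))                    # (width, emb_dim, n_filters)

feats = char_cnn_features("convolutions are all you need", vocab, emb, filt)
print(feats.shape)  # (4,) -- one value per filter, regardless of text length
```

Because the max-pool collapses the time axis, the output size depends only on the number of filters, which is one reason CNNs handle the long sequences produced by character-level modeling more gracefully than RNNs.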
