In-depth long read: Standing on the Shoulders of Giants in NLP (Part 1): https://www.jiqizhixin.com/articles/2018-12-10-17

Standing on the Shoulders of Giants in NLP (Part 2): From CoVe to BERT: https://www.jiqizhixin.com/articles/2018-12-17-17?from=synced&keyword=NLP%E7%9A%84%E5%B7%A8%E4%BA%BA%E8%82%A9%E8%86%80

Illustrated: BERT and ELMo, the two leading NLP models of 2018: https://mp.weixin.qq.com/s?__biz=MzI3MTA0MTk1MA==&mid=2652033813&idx=3&sn=ba5712022bca369e2fd8542fececaa57&scene=0#wechat_redirect

Have the rules of the NLP game been rewritten? From word2vec and ELMo to BERT: https://www.jiqizhixin.com/articles/2018-12-24-19?from=synced&keyword=elmo

ELMo, the best word embeddings yet: Deep Contextualized Word Representations: https://zhuanlan.zhihu.com/p/38254332

Learning and analyzing the BERT model: https://www.jianshu.com/p/160c4800b9b5

Category:

Technical points:

Related articles: