assignment #4: The softmax in the decode stage is very expensive to compute. Is there a way around it? #18

Open
BobOfRivia opened this issue Feb 26, 2020 · 3 comments

Comments

@BobOfRivia
Contributor

BobOfRivia commented Feb 26, 2020

For decoding a single sentence, the time complexity of the beam search stage should be k · step · |V|.
Training iterations are also very slow. Is there any way to address this?
Could the objective function be modified, along the lines of negative sampling in skip-gram? (A rough sketch of that idea is below.)
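For concreteness, a minimal sketch of what such a sampled-softmax / negative-sampling-style training loss could look like, assuming a PyTorch decoder with hidden states `h`, an output embedding matrix `W`, and bias `b`. All names here are illustrative, not from the assignment code, and a real implementation would use a unigram or log-uniform proposal and remove accidental hits of the target among the negatives:

```python
# Sketch of a sampled-softmax style loss (illustrative, not the assignment's code).
# h: decoder hidden states, shape (batch, hidden)
# W: output embedding matrix, shape (|V|, hidden); b: bias, shape (|V|,)
import torch
import torch.nn.functional as F

def sampled_softmax_loss(h, W, b, targets, num_samples=256):
    """Approximate full-softmax cross-entropy by scoring each target word
    against a small shared set of sampled negative words instead of all |V|."""
    vocab_size = W.size(0)
    # Uniform negative samples for simplicity; a proper version corrects for
    # the proposal distribution and drops negatives equal to the target.
    neg = torch.randint(0, vocab_size, (num_samples,), device=h.device)

    # Scores for the true targets: (batch,)
    pos_logits = (h * W[targets]).sum(dim=-1) + b[targets]
    # Scores for the shared negatives: (batch, num_samples)
    neg_logits = h @ W[neg].t() + b[neg]

    # Cross-entropy over the reduced candidate set {target} ∪ negatives,
    # with the target always at index 0.
    logits = torch.cat([pos_logits.unsqueeze(1), neg_logits], dim=1)
    labels = torch.zeros(h.size(0), dtype=torch.long, device=h.device)
    return F.cross_entropy(logits, labels)
```

This only reduces the cost of the training-time softmax; the decode-time cost is a separate issue, as discussed below.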

@1024er

1024er commented Mar 1, 2020

To get a real speedup you need to avoid computing the normalization denominator during generation.
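One common way to make that possible (just a sketch of the general idea, not necessarily the approach intended here) is self-normalized training: add a penalty that pushes log Z = logsumexp(logits) toward 0, so that at decode time raw logits can be treated as log-probabilities. The function name and penalty weight below are illustrative:

```python
# Sketch of a self-normalization penalty (illustrative, not the assignment's code).
import torch
import torch.nn.functional as F

def self_normalized_loss(logits, targets, alpha=0.1):
    """Standard cross-entropy plus a penalty driving log Z = logsumexp(logits)
    toward 0, so logits[w] ≈ log P(w) without explicit normalization."""
    log_z = torch.logsumexp(logits, dim=-1)       # per-step normalizer, shape (batch,)
    ce = F.cross_entropy(logits, targets)         # usual training objective
    return ce + alpha * (log_z ** 2).mean()       # encourage log Z ≈ 0

# At decode time, each beam-search step can then score only candidate words
# (e.g. a vocabulary shortlist) with their raw logits, skipping the |V|-wide
# normalizer entirely.
```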

@1024er

1024er commented Mar 8, 2020

Here I'd recommend a talk Prof. Feng Yang (冯洋) gave at Jiangmen (将门) last year, which covers exactly this: training improvements and decoding speedup for NMT. Reposting the recording and slides:

#Jiangmen (将门) Tech Community Online Sharing, Session 176#

Feng Yang, Associate Researcher at ICT, Chinese Academy of Sciences: Training Improvements and Decoding Speedup for Neural Machine Translation

Baidu Netdisk >> https://pan.baidu.com/s/1py_RxX0RaF9AcA_L-fccWQ (extraction code: gyym)
Bilibili >> https://www.bilibili.com/video/av74001189/

@BobOfRivia
Contributor Author


Thanks a lot, I'll go take a look.
