When using the `activation` argument of the `TransformerEncoderLayer` API, I need to pass in a LeakyReLU activation. As you know, this activation takes a parameter: the `negative_slope` argument of the `F.leaky_relu` / `nn.LeakyReLU` activations you provide. At the moment this argument only accepts an activation without parameters, or one with its default parameters.
How can I pass a LeakyReLU with a custom `negative_slope` to `TransformerEncoderLayer`'s `activation`?
Hi, at the moment this API cannot take activation-function parameters; only the default values can be used.
Then this is a feature request, could you consider adding support for it in a future update? 😂
Let me pass this on to the relevant team and see how they schedule it.
Thanks.
```python
import paddle
from paddle.nn import TransformerEncoderLayer

enc_input = paddle.rand((2, 4, 128))
attn_mask = paddle.rand((2, 2, 4, 4))
encoder_layer = TransformerEncoderLayer(128, 2, 512)
# Override the layer's activation after construction
act = paddle.nn.LeakyReLU(negative_slope=0.01)
encoder_layer.activation = act
```
You can customize it this way.
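More generally, an activation with a bound hyperparameter can be built with `functools.partial`, yielding a one-argument callable of the shape the layer expects. A minimal sketch; `leaky_relu` below is a hypothetical scalar stand-in for `paddle.nn.functional.leaky_relu` so the snippet runs without Paddle installed:

```python
from functools import partial

# Hypothetical stand-in for paddle.nn.functional.leaky_relu,
# operating on a single scalar for illustration.
def leaky_relu(x, negative_slope=0.01):
    return x if x >= 0 else negative_slope * x

# Bind the hyperparameter once; the result is a one-argument
# callable, like the parameter-free activations the API accepts.
act = partial(leaky_relu, negative_slope=0.2)

print(act(-1.0))  # -0.2
print(act(3.0))   # 3.0
```

The same bound callable could then be assigned to `encoder_layer.activation` in place of an `nn.LeakyReLU` instance.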
Great, that works. Thanks a lot for the quick response @FrostML @YuanRisheng