
Feature request for the `activation` parameter of TransformerEncoderLayer. It's a short question, advice appreciated. #70385

Open
sealoongleft opened this issue Dec 20, 2024 · 6 comments


sealoongleft commented Dec 20, 2024

Feature Description

Task goal:

When using the `activation` parameter of the TransformerEncoderLayer API, I need to pass in a LeakyReLU activation. As you know, this activation function takes an argument of its own, namely the `negative_slope` parameter of the F.leaky_relu function or the nn.LeakyReLU layer you provide.
Currently this parameter only accepts activation functions that take no arguments, or ones used with their default arguments.

Request:

How can a LeakyReLU with a custom `negative_slope` be passed as the `activation` of TransformerEncoderLayer?

Alternatives

No response

@YuanRisheng
Contributor

Hi, at the moment this API cannot accept arguments for the activation function; only the default values can be used.

@sealoongleft
Author

This is a feature request then. Could you consider adding support for it in a future update? 😂

@YuanRisheng
Contributor

I'll pass this along to the relevant team and see how they schedule it.

@sealoongleft
Author

Thanks a lot.

@FrostML
Contributor

FrostML commented Dec 23, 2024

import paddle
from paddle.nn import TransformerEncoderLayer

enc_input = paddle.rand((2, 4, 128))
attn_mask = paddle.rand((2, 2, 4, 4))
encoder_layer = TransformerEncoderLayer(128, 2, 512)

act = paddle.nn.LeakyReLU(negative_slope=0.01)
encoder_layer.activation = act

You can customize it this way.
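Editorial sketch (not from the thread): assuming the layer simply invokes `self.activation(x)` on whatever callable is stored, `functools.partial` is another way to bind `negative_slope` ahead of time, producing a one-argument callable just like a parameter-free activation. The toy scalar `leaky_relu` below is a stand-in so the idea runs without Paddle installed; the Paddle-specific assignment is shown as a hedged comment mirroring the snippet above.

```python
from functools import partial

# Toy scalar stand-in for paddle.nn.functional.leaky_relu.
def leaky_relu(x, negative_slope=0.01):
    return x if x >= 0 else negative_slope * x

# Bind the slope once; the result takes a single argument,
# exactly like an activation function with no extra parameters.
act = partial(leaky_relu, negative_slope=0.1)

print(act(2.0))   # 2.0  (positive inputs pass through unchanged)
print(act(-2.0))  # -0.2 (negative inputs scaled by 0.1)

# With Paddle (assumption, mirroring the workaround above):
# encoder_layer.activation = partial(
#     paddle.nn.functional.leaky_relu, negative_slope=0.1)
```

This avoids constructing an `nn.LeakyReLU` layer object when a plain function is preferred; both approaches hinge on the layer calling its stored activation as an ordinary callable.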

@sealoongleft
Author

sealoongleft commented Dec 23, 2024

That works, nicely done!
Thanks a lot, that was a very quick response.
@FrostML @YuanRisheng
