
Implementing GELU activation #20464

@ghost

Description

Gaussian Error Linear Unit (GELU) has become mainstream, with models such as BERT using it, but nn.modules.activation doesn't have a GELU implementation. This is a feature request to expand the set of activation functions available out of the box in PyTorch :)

I could work on it and get it merged upstream, if assigned.
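For reference, GELU is defined as GELU(x) = x * Phi(x), where Phi is the CDF of the standard normal distribution. Below is a minimal sketch of what an nn.Module wrapper could look like, using the exact erf-based form; the class name and the module layout here are illustrative, not a final API:

```python
import math

import torch
import torch.nn as nn


class GELU(nn.Module):
    """Gaussian Error Linear Unit: GELU(x) = x * Phi(x),
    where Phi is the standard normal CDF."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Exact form via the error function:
        # Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
        return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))


# Quick sanity check
gelu = GELU()
print(gelu(torch.randn(4)))
```

The original paper (and BERT) also uses a tanh approximation, 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x**3))), which could optionally be offered behind a flag.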


Labels

    enhancement - Not as big of a feature, but technically not a bug. Should be easy to fix.
    module: nn - Related to torch.nn.
    triaged - This issue has been looked at by a team member, and triaged and prioritized into an appropriate module.
