Commit

disabling torch.jit.script here for massive performance boost when using torch.compile, our default. see issue karpathy#11. thanks @vgoklani for flagging
karpathy committed Jan 2, 2023
1 parent ea4de19 commit 177d5f7
Showing 1 changed file with 3 additions and 3 deletions.
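
Context for the change: with torch.compile (PyTorch 2.0, which nanoGPT enables by default for training), the whole model forward pass is captured and optimized, so a separately TorchScript-compiled GELU helper is redundant, and per this commit and issue karpathy#11 it was hurting performance. A minimal sketch of the intended usage, assuming the standard torch.compile API; the GPT constructor call is illustrative, not copied from the repo's train.py:

    import torch

    model = GPT(config)           # illustrative: the nanoGPT model containing the MLP blocks
    model = torch.compile(model)  # PyTorch 2.0: compiles the full forward pass, new_gelu included
    # With @torch.jit.script commented out, new_gelu is a plain Python function that the
    # compiler can capture and fuse; leaving it scripted was reported to slow things down.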
6 changes: 3 additions & 3 deletions model.py
@@ -14,8 +14,8 @@
 import torch.nn as nn
 from torch.nn import functional as F
 
-@torch.jit.script
-def fused_gelu(x):
+# @torch.jit.script # good to enable when not using torch.compile, disable when using (our default)
+def new_gelu(x):
     """
     Implementation of the GELU activation function currently in Google BERT repo (identical to OpenAI GPT).
     Reference: Gaussian Error Linear Units (GELU) paper: https://arxiv.org/abs/1606.08415
@@ -71,7 +71,7 @@ def __init__(self, config):
 
     def forward(self, x):
         x = self.c_fc(x)
-        x = fused_gelu(x)
+        x = new_gelu(x)
         x = self.c_proj(x)
         x = self.dropout(x)
         return x
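
The diff truncates the body of new_gelu, but its docstring points at the BERT-repo GELU, which is the standard tanh approximation. A sketch of what the function presumably computes, reconstructed from that referenced formula rather than copied from the diff:

    import math
    import torch

    def new_gelu(x):
        # tanh approximation of GELU: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
        return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) *
                                           (x + 0.044715 * torch.pow(x, 3.0))))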
