
Commit

fix remat checkpoint of input
samos123 committed Nov 21, 2024
1 parent 45d769d commit 408033d
Showing 1 changed file with 1 addition and 1 deletion.
axlearn/common/attention.py: 1 addition & 1 deletion
```diff
@@ -3159,7 +3159,7 @@ def _forward_for_mode(
         Raises:
             ValueError: If `mode` is unsupported.
         """
-        self._remat_name(data, "input")
+        data = self._remat_name(data, "input")
         self.vlog(3, "transformer.input=%s", data.sum())
         self_attention_return_aux = set()
         cross_attention_return_aux = set()
```
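
The fix matters because the remat-name helper returns the tagged value rather than modifying its argument in place: if the return value is discarded, the "input" checkpoint name never attaches to the tensor that flows through the rest of the layer, so remat policies keyed on that name cannot find it. Below is a minimal sketch of the mechanism, assuming `_remat_name` wraps JAX's name-based checkpointing (`jax.ad_checkpoint.checkpoint_name`); the `block` function, names, and shapes are illustrative, not axlearn's actual API.

```python
# Minimal sketch of JAX name-based rematerialization (illustrative only;
# axlearn's _remat_name is assumed to tag values via checkpoint_name).
import jax
import jax.numpy as jnp
from jax.ad_checkpoint import checkpoint, checkpoint_name


def block(w, x):
    h = jnp.dot(x, w)
    # checkpoint_name returns a *new*, tagged value; it does not modify `h`
    # in place. Discarding the return value (as the pre-fix code did with
    # `data`) leaves downstream ops working with an untagged tensor.
    h = checkpoint_name(h, "input")
    return jnp.sin(h)


# Policy: during the backward pass, save only tensors tagged "input" and
# recompute everything else.
remat_block = checkpoint(
    block, policy=jax.checkpoint_policies.save_only_these_names("input")
)

w = jnp.ones((8, 8))
x = jnp.ones((4, 8))
loss = lambda w: remat_block(w, x).sum()
print(jax.grad(loss)(w))
```

In the pre-fix code the tag was computed but thrown away, so a policy such as `save_only_these_names("input")` would not match the activations the layer actually uses.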

