Commit
fix bug for issue#38
scutpaul committed Nov 6, 2023
1 parent 1012d3a commit d586880
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion lvdm/modules/attention.py
@@ -120,7 +120,7 @@ def forward(self, x, context=None, mask=None):
      del k_ip
      sim_ip = sim_ip.softmax(dim=-1)
      out_ip = torch.einsum('b i j, b j d -> b i d', sim_ip, v_ip)
-     out_ip = rearrange(out, '(b h) n d -> b n (h d)', h=h)
+     out_ip = rearrange(out_ip, '(b h) n d -> b n (h d)', h=h)
      out = out + self.image_cross_attention_scale * out_ip
      del q

