
Conversation

@emergenz

Adding batch_first as an __init__ argument of MultiHeadAttention is just a quick fix, since the argument is accepted but never actually used.

It does the job, though.
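
For reference, a minimal sketch of what such a quick fix could look like. The class body and its internals are hypothetical and only illustrate the pattern of accepting batch_first for signature compatibility (mirroring torch.nn.MultiheadAttention) without acting on it:

```python
import torch.nn as nn


class MultiHeadAttention(nn.Module):
    """Illustrative stand-in for the module discussed here (not the actual implementation)."""

    def __init__(self, embed_dim: int, num_heads: int, batch_first: bool = False):
        super().__init__()
        # Quick fix: batch_first is accepted so callers written against the
        # torch.nn.MultiheadAttention signature don't fail, but it is never
        # consulted -- inputs are still expected as (seq_len, batch, embed_dim).
        self.batch_first = batch_first
        self.attn = nn.MultiheadAttention(embed_dim, num_heads)

    def forward(self, query, key, value, attn_mask=None):
        out, _ = self.attn(query, key, value, attn_mask=attn_mask)
        return out
```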
