torch-multi-head-attention

Multi-head attention implemented in PyTorch

Installation

In a virtualenv (create one first if you need to):

pip3 install torch-multi-head-attention
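
Once installed, usage is a single module import. The sketch below is a minimal example; the import path torch_multi_head_attention and the MultiHeadAttention(in_features=..., head_num=...) constructor are assumptions based on the package's naming conventions, so check the project README for the authoritative API:

import torch
from torch_multi_head_attention import MultiHeadAttention  # assumed import path

# Assumed constructor: in_features must be divisible by head_num,
# since each head attends over in_features / head_num dimensions.
attention = MultiHeadAttention(in_features=64, head_num=8)

x = torch.rand(2, 10, 64)    # (batch, sequence length, features)
output = attention(x, x, x)  # self-attention: query = key = value
print(output.shape)          # expected: torch.Size([2, 10, 64])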

Releases

Version   Released     Bullseye (Python 3.9)   Bookworm (Python 3.11)   Files
0.15.1    2019-02-24
0.15.0    2019-02-21
