torch-multi-head-attention
Multi-head attention implemented in PyTorch.
Installation
In a virtualenv (see these instructions if you need to create one):
pip3 install torch-multi-head-attention
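The package implements the multi-head attention mechanism from "Attention Is All You Need": the input is projected into several heads, each head runs scaled dot-product attention, and the head outputs are concatenated and projected back. As an illustration of that computation (not the package's actual API — the function and weight names below are hypothetical), here is a minimal NumPy sketch of self-attention:

```python
import numpy as np

def multi_head_attention(x, w_q, w_k, w_v, w_o, num_heads):
    """Scaled dot-product multi-head self-attention on x.

    x: (seq_len, d_model); each weight matrix: (d_model, d_model).
    Hypothetical illustration, not the torch-multi-head-attention API.
    """
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    def split_heads(m):
        # (seq_len, d_model) -> (num_heads, seq_len, d_head)
        return m.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q = split_heads(x @ w_q)
    k = split_heads(x @ w_k)
    v = split_heads(x @ w_v)

    # Attention scores, scaled by sqrt of the per-head dimension
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)

    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    heads = weights @ v                                  # (num_heads, seq_len, d_head)
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ w_o                                  # final output projection

rng = np.random.default_rng(0)
d_model, seq_len, num_heads = 8, 4, 2
x = rng.standard_normal((seq_len, d_model))
w = [rng.standard_normal((d_model, d_model)) for _ in range(4)]
out = multi_head_attention(x, *w, num_heads=num_heads)
print(out.shape)  # (4, 8): same sequence length and model dimension as the input
```

In the package's PyTorch version, the projections are learned `nn.Linear` layers and the whole computation is differentiable; this sketch only shows the forward pass.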
Releases
Version | Released | Buster Python 3.7 | Bullseye Python 3.9 | Bookworm Python 3.11 | Files
---|---|---|---|---|---
0.15.1 | 2019-02-24 | | | |
0.15.0 | 2019-02-21 | | | |
Issues with this package?
- Search issues for this package
- Package or version missing? Open a new issue
- Something else? Open a new issue