torch-multi-head-attention

Multi-head attention implemented in PyTorch
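
For context, multi-head attention runs several scaled dot-product attention heads in parallel over projected queries, keys, and values and concatenates their outputs. The sketch below illustrates the idea with PyTorch's built-in torch.nn.MultiheadAttention as a reference point; it is not this package's API.

import torch
import torch.nn as nn

# Stock PyTorch layer shown for illustration only (torch.nn.MultiheadAttention),
# not the class provided by torch-multi-head-attention.
mha = nn.MultiheadAttention(embed_dim=64, num_heads=8, batch_first=True)

x = torch.rand(3, 10, 64)    # (batch, sequence length, features)
out, weights = mha(x, x, x)  # self-attention: query = key = value
print(out.shape)             # torch.Size([3, 10, 64])
print(weights.shape)         # torch.Size([3, 10, 10]), averaged over heads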

Installation

In a virtualenv (create one first if you don't already have one):

pip3 install torch-multi-head-attention
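
This page does not document the package's API, so the following is a minimal usage sketch under assumptions: that the package exposes a MultiHeadAttention module importable from torch_multi_head_attention, with a constructor taking the input feature size and head count, and a forward pass over query, key, and value tensors. The actual names and signatures may differ; check the installed package.

import torch
from torch_multi_head_attention import MultiHeadAttention  # assumed module and class name

# Assumed constructor arguments: input feature size and number of heads.
attention = MultiHeadAttention(in_features=64, head_num=8)

# Batch of 3 sequences, 10 tokens each, 64 features per token.
q = torch.rand(3, 10, 64)
k = torch.rand(3, 10, 64)
v = torch.rand(3, 10, 64)

# Assumed forward signature: (query, key, value) -> (3, 10, 64) tensor.
output = attention(q, k, v)
print(output.shape)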

Releases

Version  Released    Buster (Python 3.7)  Bullseye (Python 3.9)  Bookworm (Python 3.11)  Files
0.15.1   2019-02-24
0.15.0   2019-02-21

