Created temporary directory: /tmp/pip-ephem-wheel-cache-8u3jt4_c
Created temporary directory: /tmp/pip-req-tracker-fdd1wjv8
Created requirements tracker '/tmp/pip-req-tracker-fdd1wjv8'
Created temporary directory: /tmp/pip-wheel-794xllpe
Collecting torch-multi-head-attention==0.15.1
  1 location(s) to search for versions of torch-multi-head-attention:
  * https://pypi.org/simple/torch-multi-head-attention/
  Getting page https://pypi.org/simple/torch-multi-head-attention/
  Analyzing links from page https://pypi.org/simple/torch-multi-head-attention/
    Found link https://files.pythonhosted.org/packages/fe/c6/6bf35cf1292e4a7756c534fa4373f539a946904435338e5b1d0a6ae444a6/torch-multi-head-attention-0.15.0.tar.gz#sha256=7e51ea6a54b4ee16134c00ea8930e444757f326951ba9cde709410e7c11c9fe1 (from https://pypi.org/simple/torch-multi-head-attention/), version: 0.15.0
    Found link https://files.pythonhosted.org/packages/8d/73/b0734654ec4c950270d32c3d4ffb7460e63df229021a52386bf86356e815/torch-multi-head-attention-0.15.1.tar.gz#sha256=e181602fe1ef6da8322cb6bc1ffb41f52d3658c54e3937040e8f186754bb3056 (from https://pypi.org/simple/torch-multi-head-attention/), version: 0.15.1
  Using version 0.15.1 (newest of versions: 0.15.1)
  Created temporary directory: /tmp/pip-unpack-f22enlne
  Downloading https://files.pythonhosted.org/packages/8d/73/b0734654ec4c950270d32c3d4ffb7460e63df229021a52386bf86356e815/torch-multi-head-attention-0.15.1.tar.gz
  Downloading from URL https://files.pythonhosted.org/packages/8d/73/b0734654ec4c950270d32c3d4ffb7460e63df229021a52386bf86356e815/torch-multi-head-attention-0.15.1.tar.gz#sha256=e181602fe1ef6da8322cb6bc1ffb41f52d3658c54e3937040e8f186754bb3056 (from https://pypi.org/simple/torch-multi-head-attention/)
  Added torch-multi-head-attention==0.15.1 from https://files.pythonhosted.org/packages/8d/73/b0734654ec4c950270d32c3d4ffb7460e63df229021a52386bf86356e815/torch-multi-head-attention-0.15.1.tar.gz#sha256=e181602fe1ef6da8322cb6bc1ffb41f52d3658c54e3937040e8f186754bb3056 to
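The `#sha256=` fragment on each download URL above is what pip checks the fetched archive against before using it. In essence the check amounts to hashing the downloaded bytes and comparing hex digests; a minimal stdlib sketch (using the 0.15.1 digest from the log, and a hypothetical `verify_sha256` helper, not pip's actual internal API):

```python
import hashlib

# sha256 pin from the 0.15.1 download URL in the log above
EXPECTED_0_15_1 = "e181602fe1ef6da8322cb6bc1ffb41f52d3658c54e3937040e8f186754bb3056"

def verify_sha256(data: bytes, expected_hex: str) -> bool:
    # Hash the downloaded archive bytes and compare against the
    # #sha256= fragment from the index link.
    return hashlib.sha256(data).hexdigest() == expected_hex
```

Real pip streams the download through the hasher rather than holding the whole archive in memory, but the comparison is the same.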
build tracker '/tmp/pip-req-tracker-fdd1wjv8'
  Running setup.py (path:/tmp/pip-wheel-794xllpe/torch-multi-head-attention/setup.py) egg_info for package torch-multi-head-attention
  Running command python setup.py egg_info
  running egg_info
  creating pip-egg-info/torch_multi_head_attention.egg-info
  writing pip-egg-info/torch_multi_head_attention.egg-info/PKG-INFO
  writing requirements to pip-egg-info/torch_multi_head_attention.egg-info/requires.txt
  writing top-level names to pip-egg-info/torch_multi_head_attention.egg-info/top_level.txt
  writing dependency_links to pip-egg-info/torch_multi_head_attention.egg-info/dependency_links.txt
  writing manifest file 'pip-egg-info/torch_multi_head_attention.egg-info/SOURCES.txt'
  reading manifest file 'pip-egg-info/torch_multi_head_attention.egg-info/SOURCES.txt'
  reading manifest template 'MANIFEST.in'
  writing manifest file 'pip-egg-info/torch_multi_head_attention.egg-info/SOURCES.txt'
  Source in /tmp/pip-wheel-794xllpe/torch-multi-head-attention has version 0.15.1, which satisfies requirement torch-multi-head-attention==0.15.1 from https://files.pythonhosted.org/packages/8d/73/b0734654ec4c950270d32c3d4ffb7460e63df229021a52386bf86356e815/torch-multi-head-attention-0.15.1.tar.gz#sha256=e181602fe1ef6da8322cb6bc1ffb41f52d3658c54e3937040e8f186754bb3056
  Removed torch-multi-head-attention==0.15.1 from https://files.pythonhosted.org/packages/8d/73/b0734654ec4c950270d32c3d4ffb7460e63df229021a52386bf86356e815/torch-multi-head-attention-0.15.1.tar.gz#sha256=e181602fe1ef6da8322cb6bc1ffb41f52d3658c54e3937040e8f186754bb3056 from build tracker '/tmp/pip-req-tracker-fdd1wjv8'
Building wheels for collected packages: torch-multi-head-attention
  Created temporary directory: /tmp/pip-wheel-p0v5ujdy
  Running setup.py bdist_wheel for torch-multi-head-attention: started
  Destination directory: /tmp/pip-wheel-p0v5ujdy
  Running command /usr/bin/python3 -u -c "import setuptools,
tokenize;__file__='/tmp/pip-wheel-794xllpe/torch-multi-head-attention/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-p0v5ujdy
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/torch_multi_head_attention
  copying torch_multi_head_attention/__init__.py -> build/lib/torch_multi_head_attention
  copying torch_multi_head_attention/multi_head_attention.py -> build/lib/torch_multi_head_attention
  creating build/lib/tests
  copying tests/test_scaled_dot_product_attention.py -> build/lib/tests
  copying tests/__init__.py -> build/lib/tests
  copying tests/test_multi_head_attention.py -> build/lib/tests
  installing to build/bdist.linux-armv7l/wheel
  running install
  running install_lib
  creating build/bdist.linux-armv7l
  creating build/bdist.linux-armv7l/wheel
  creating build/bdist.linux-armv7l/wheel/torch_multi_head_attention
  copying build/lib/torch_multi_head_attention/__init__.py -> build/bdist.linux-armv7l/wheel/torch_multi_head_attention
  copying build/lib/torch_multi_head_attention/multi_head_attention.py -> build/bdist.linux-armv7l/wheel/torch_multi_head_attention
  creating build/bdist.linux-armv7l/wheel/tests
  copying build/lib/tests/test_scaled_dot_product_attention.py -> build/bdist.linux-armv7l/wheel/tests
  copying build/lib/tests/__init__.py -> build/bdist.linux-armv7l/wheel/tests
  copying build/lib/tests/test_multi_head_attention.py -> build/bdist.linux-armv7l/wheel/tests
  running install_egg_info
  running egg_info
  writing torch_multi_head_attention.egg-info/PKG-INFO
  writing requirements to torch_multi_head_attention.egg-info/requires.txt
  writing dependency_links to torch_multi_head_attention.egg-info/dependency_links.txt
  writing top-level names to torch_multi_head_attention.egg-info/top_level.txt
  reading manifest file 'torch_multi_head_attention.egg-info/SOURCES.txt'
  reading manifest template 'MANIFEST.in'
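The files being copied (`multi_head_attention.py`, `tests/test_scaled_dot_product_attention.py`) implement and test the attention operation softmax(QKᵀ/√d_k)·V. As context only, here is an illustrative pure-Python sketch of scaled dot-product attention; this is not the package's own implementation, and the function name is ours:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(q, k, v):
    # q, k, v are lists of row vectors (lists of floats).
    # Returns softmax(Q K^T / sqrt(d_k)) V, computed row by row.
    d_k = len(k[0])
    scores = [[sum(qi * ki for qi, ki in zip(qrow, krow)) / math.sqrt(d_k)
               for krow in k] for qrow in q]
    weights = [softmax(row) for row in scores]
    return [[sum(w * vrow[j] for w, vrow in zip(wrow, v))
             for j in range(len(v[0]))] for wrow in weights]
```

With one-hot value rows, each output row is exactly the attention weight vector, which is a quick way to sanity-check that the weights sum to 1.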
  writing manifest file 'torch_multi_head_attention.egg-info/SOURCES.txt'
  Copying torch_multi_head_attention.egg-info to build/bdist.linux-armv7l/wheel/torch_multi_head_attention-0.15.1-py3.4.egg-info
  running install_scripts
  creating build/bdist.linux-armv7l/wheel/torch_multi_head_attention-0.15.1.dist-info/WHEEL
  creating '/tmp/pip-wheel-p0v5ujdy/torch_multi_head_attention-0.15.1-py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it
  adding 'tests/__init__.py'
  adding 'tests/test_multi_head_attention.py'
  adding 'tests/test_scaled_dot_product_attention.py'
  adding 'torch_multi_head_attention/__init__.py'
  adding 'torch_multi_head_attention/multi_head_attention.py'
  adding 'torch_multi_head_attention-0.15.1.dist-info/METADATA'
  adding 'torch_multi_head_attention-0.15.1.dist-info/WHEEL'
  adding 'torch_multi_head_attention-0.15.1.dist-info/top_level.txt'
  adding 'torch_multi_head_attention-0.15.1.dist-info/RECORD'
  removing build/bdist.linux-armv7l/wheel
  Running setup.py bdist_wheel for torch-multi-head-attention: finished with status 'done'
  Stored in directory: /tmp/tmp_pcokkf4
Successfully built torch-multi-head-attention
Cleaning up...
  Removing source in /tmp/pip-wheel-794xllpe/torch-multi-head-attention
Removed build tracker '/tmp/pip-req-tracker-fdd1wjv8'
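The wheel name produced above, `torch_multi_head_attention-0.15.1-py3-none-any.whl`, encodes compatibility tags in the PEP 427 layout `{name}-{version}-{python}-{abi}-{platform}.whl` ("py3-none-any" means pure Python 3, no compiled ABI, any platform, which is why this build works on the linux-armv7l host seen in the log). A simplified parsing sketch, assuming no optional build tag; for real code, `packaging.utils.parse_wheel_filename` is the robust choice:

```python
def parse_wheel_filename(fname):
    # PEP 427: {distribution}-{version}(-{build})?-{python}-{abi}-{platform}.whl
    # Simplified: assumes no build tag and no '-' inside name/version
    # (wheel names use '_' in place of '-').
    stem = fname[: -len(".whl")]
    parts = stem.split("-")
    name, version = parts[0], parts[1]
    python_tag, abi_tag, platform_tag = parts[-3], parts[-2], parts[-1]
    return name, version, python_tag, abi_tag, platform_tag
```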