Created temporary directory: /tmp/pip-ephem-wheel-cache-whu9eoxn
Created temporary directory: /tmp/pip-req-tracker-qwronumf
Created requirements tracker '/tmp/pip-req-tracker-qwronumf'
Created temporary directory: /tmp/pip-wheel-fg3mmhj4
Collecting torch-multi-head-attention==0.15.0
  1 location(s) to search for versions of torch-multi-head-attention:
  * https://pypi.org/simple/torch-multi-head-attention/
  Getting page https://pypi.org/simple/torch-multi-head-attention/
  Analyzing links from page https://pypi.org/simple/torch-multi-head-attention/
    Found link https://files.pythonhosted.org/packages/fe/c6/6bf35cf1292e4a7756c534fa4373f539a946904435338e5b1d0a6ae444a6/torch-multi-head-attention-0.15.0.tar.gz#sha256=7e51ea6a54b4ee16134c00ea8930e444757f326951ba9cde709410e7c11c9fe1 (from https://pypi.org/simple/torch-multi-head-attention/), version: 0.15.0
  Using version 0.15.0 (newest of versions: 0.15.0)
  Created temporary directory: /tmp/pip-unpack-ojfw99sa
  Downloading https://files.pythonhosted.org/packages/fe/c6/6bf35cf1292e4a7756c534fa4373f539a946904435338e5b1d0a6ae444a6/torch-multi-head-attention-0.15.0.tar.gz
  Downloading from URL https://files.pythonhosted.org/packages/fe/c6/6bf35cf1292e4a7756c534fa4373f539a946904435338e5b1d0a6ae444a6/torch-multi-head-attention-0.15.0.tar.gz#sha256=7e51ea6a54b4ee16134c00ea8930e444757f326951ba9cde709410e7c11c9fe1 (from https://pypi.org/simple/torch-multi-head-attention/)
  Added torch-multi-head-attention==0.15.0 from https://files.pythonhosted.org/packages/fe/c6/6bf35cf1292e4a7756c534fa4373f539a946904435338e5b1d0a6ae444a6/torch-multi-head-attention-0.15.0.tar.gz#sha256=7e51ea6a54b4ee16134c00ea8930e444757f326951ba9cde709410e7c11c9fe1 to build tracker '/tmp/pip-req-tracker-qwronumf'
    Running setup.py (path:/tmp/pip-wheel-fg3mmhj4/torch-multi-head-attention/setup.py) egg_info for package torch-multi-head-attention
    Running command python setup.py egg_info
    running egg_info
    creating pip-egg-info/torch_multi_head_attention.egg-info
    writing requirements to pip-egg-info/torch_multi_head_attention.egg-info/requires.txt
    writing top-level names to pip-egg-info/torch_multi_head_attention.egg-info/top_level.txt
    writing dependency_links to pip-egg-info/torch_multi_head_attention.egg-info/dependency_links.txt
    writing pip-egg-info/torch_multi_head_attention.egg-info/PKG-INFO
    writing manifest file 'pip-egg-info/torch_multi_head_attention.egg-info/SOURCES.txt'
    reading manifest file 'pip-egg-info/torch_multi_head_attention.egg-info/SOURCES.txt'
    reading manifest template 'MANIFEST.in'
    writing manifest file 'pip-egg-info/torch_multi_head_attention.egg-info/SOURCES.txt'
  Source in /tmp/pip-wheel-fg3mmhj4/torch-multi-head-attention has version 0.15.0, which satisfies requirement torch-multi-head-attention==0.15.0 from https://files.pythonhosted.org/packages/fe/c6/6bf35cf1292e4a7756c534fa4373f539a946904435338e5b1d0a6ae444a6/torch-multi-head-attention-0.15.0.tar.gz#sha256=7e51ea6a54b4ee16134c00ea8930e444757f326951ba9cde709410e7c11c9fe1
  Removed torch-multi-head-attention==0.15.0 from https://files.pythonhosted.org/packages/fe/c6/6bf35cf1292e4a7756c534fa4373f539a946904435338e5b1d0a6ae444a6/torch-multi-head-attention-0.15.0.tar.gz#sha256=7e51ea6a54b4ee16134c00ea8930e444757f326951ba9cde709410e7c11c9fe1 from build tracker '/tmp/pip-req-tracker-qwronumf'
Building wheels for collected packages: torch-multi-head-attention
  Created temporary directory: /tmp/pip-wheel-wqnwhtly
  Running setup.py bdist_wheel for torch-multi-head-attention: started
  Destination directory: /tmp/pip-wheel-wqnwhtly
  Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-wheel-fg3mmhj4/torch-multi-head-attention/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-wqnwhtly
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/torch_multi_head_attention
  copying torch_multi_head_attention/__init__.py -> build/lib/torch_multi_head_attention
  copying torch_multi_head_attention/multi_head_attention.py -> build/lib/torch_multi_head_attention
  creating build/lib/tests
  copying tests/test_scaled_dot_product_attention.py -> build/lib/tests
  copying tests/__init__.py -> build/lib/tests
  copying tests/test_multi_head_attention.py -> build/lib/tests
  installing to build/bdist.linux-armv7l/wheel
  running install
  running install_lib
  creating build/bdist.linux-armv7l
  creating build/bdist.linux-armv7l/wheel
  creating build/bdist.linux-armv7l/wheel/torch_multi_head_attention
  copying build/lib/torch_multi_head_attention/__init__.py -> build/bdist.linux-armv7l/wheel/torch_multi_head_attention
  copying build/lib/torch_multi_head_attention/multi_head_attention.py -> build/bdist.linux-armv7l/wheel/torch_multi_head_attention
  creating build/bdist.linux-armv7l/wheel/tests
  copying build/lib/tests/test_scaled_dot_product_attention.py -> build/bdist.linux-armv7l/wheel/tests
  copying build/lib/tests/__init__.py -> build/bdist.linux-armv7l/wheel/tests
  copying build/lib/tests/test_multi_head_attention.py -> build/bdist.linux-armv7l/wheel/tests
  running install_egg_info
  running egg_info
  writing requirements to torch_multi_head_attention.egg-info/requires.txt
  writing torch_multi_head_attention.egg-info/PKG-INFO
  writing top-level names to torch_multi_head_attention.egg-info/top_level.txt
  writing dependency_links to torch_multi_head_attention.egg-info/dependency_links.txt
  reading manifest file 'torch_multi_head_attention.egg-info/SOURCES.txt'
  reading manifest template 'MANIFEST.in'
  writing manifest file 'torch_multi_head_attention.egg-info/SOURCES.txt'
  Copying torch_multi_head_attention.egg-info to build/bdist.linux-armv7l/wheel/torch_multi_head_attention-0.15.0-py3.4.egg-info
  running install_scripts
  creating build/bdist.linux-armv7l/wheel/torch_multi_head_attention-0.15.0.dist-info/WHEEL
  creating '/tmp/pip-wheel-wqnwhtly/torch_multi_head_attention-0.15.0-py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it
  adding 'tests/__init__.py'
  adding 'tests/test_multi_head_attention.py'
  adding 'tests/test_scaled_dot_product_attention.py'
  adding 'torch_multi_head_attention/__init__.py'
  adding 'torch_multi_head_attention/multi_head_attention.py'
  adding 'torch_multi_head_attention-0.15.0.dist-info/METADATA'
  adding 'torch_multi_head_attention-0.15.0.dist-info/WHEEL'
  adding 'torch_multi_head_attention-0.15.0.dist-info/top_level.txt'
  adding 'torch_multi_head_attention-0.15.0.dist-info/RECORD'
  removing build/bdist.linux-armv7l/wheel
  Running setup.py bdist_wheel for torch-multi-head-attention: finished with status 'done'
  Stored in directory: /tmp/tmpbofdt9w3
Successfully built torch-multi-head-attention
Cleaning up...
Removing source in /tmp/pip-wheel-fg3mmhj4/torch-multi-head-attention
Removed build tracker '/tmp/pip-req-tracker-qwronumf'
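The `#sha256=` fragment on the PyPI download link is how pip verifies the sdist it fetched: it hashes the downloaded bytes and compares against the expected digest. A minimal sketch of that integrity check, using a made-up payload rather than the real tarball (`verify_sha256` is an illustrative helper, not pip's API):

```python
import hashlib

def verify_sha256(data: bytes, expected_hex: str) -> bool:
    """Return True if the SHA-256 digest of `data` matches `expected_hex`."""
    return hashlib.sha256(data).hexdigest() == expected_hex

# Demonstrate on known bytes instead of the downloaded archive:
payload = b"example archive bytes"
digest = hashlib.sha256(payload).hexdigest()

print(verify_sha256(payload, digest))         # matching digest passes
print(verify_sha256(payload + b"x", digest))  # any tampering fails
```

The same comparison is what makes `--require-hashes` mode refuse a download whose digest differs from the one pinned in a requirements file.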
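The long `Running command /usr/bin/python3 -u -c "import setuptools, tokenize;…"` line is the shim this pip version uses to invoke `setup.py`: open the file with `tokenize.open` (which respects a PEP 263 encoding declaration), normalize CRLF line endings, then compile and `exec` it. A standalone sketch of the same mechanism, run against a throwaway script instead of a real `setup.py` (the temp file and the `ran` variable are invented for the demonstration):

```python
import os
import tempfile
import tokenize

with tempfile.TemporaryDirectory() as tmp:
    # Stand-in for the package's setup.py, written with CRLF endings
    # (the shim's .replace('\r\n', '\n') anticipates exactly this case).
    path = os.path.join(tmp, "setup.py")
    with open(path, "w", newline="") as f:
        f.write("ran = True\r\n")

    # The same steps pip's -c shim performs:
    namespace = {"__file__": path}
    f = getattr(tokenize, "open", open)(path)  # encoding-aware open
    code = f.read().replace("\r\n", "\n")      # normalize line endings
    f.close()
    exec(compile(code, path, "exec"), namespace)

    print(namespace["ran"])  # the script's top level actually executed
```

The `getattr(tokenize, 'open', open)` dance exists so the same one-liner also ran on interpreters whose `tokenize` module predates `tokenize.open`.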
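The build finishes by assembling `torch_multi_head_attention-0.15.0-py3-none-any.whl`. Per the wheel specification (PEP 427), that filename encodes `{distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl`. A small sketch that splits the name from the log (`parse_wheel_name` is an illustrative helper, not a pip function, and ignores the optional build-tag field):

```python
def parse_wheel_name(filename: str) -> dict:
    """Split a simple wheel filename into its PEP 427 fields."""
    stem = filename[:-len(".whl")]
    name, version, py_tag, abi_tag, plat_tag = stem.split("-")
    return {"name": name, "version": version,
            "python": py_tag, "abi": abi_tag, "platform": plat_tag}

info = parse_wheel_name("torch_multi_head_attention-0.15.0-py3-none-any.whl")
print(info["python"], info["abi"], info["platform"])  # py3 none any
```

`py3-none-any` means any Python 3, no ABI dependency, any platform: although this wheel was built on a `linux-armv7l` machine, the pure-Python result is installable everywhere.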