2023-08-26T01:09:08,871 Created temporary directory: /tmp/pip-build-tracker-85nou9w7
2023-08-26T01:09:08,872 Initialized build tracking at /tmp/pip-build-tracker-85nou9w7
2023-08-26T01:09:08,872 Created build tracker: /tmp/pip-build-tracker-85nou9w7
2023-08-26T01:09:08,872 Entered build tracker: /tmp/pip-build-tracker-85nou9w7
2023-08-26T01:09:08,873 Created temporary directory: /tmp/pip-wheel-b5_yxxgc
2023-08-26T01:09:08,876 Created temporary directory: /tmp/pip-ephem-wheel-cache-ocem_3e_
2023-08-26T01:09:08,899 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
2023-08-26T01:09:08,903 2 location(s) to search for versions of flash-attention-softmax-n:
2023-08-26T01:09:08,903 * https://pypi.org/simple/flash-attention-softmax-n/
2023-08-26T01:09:08,903 * https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-08-26T01:09:08,903 Fetching project page and analyzing links: https://pypi.org/simple/flash-attention-softmax-n/
2023-08-26T01:09:08,904 Getting page https://pypi.org/simple/flash-attention-softmax-n/
2023-08-26T01:09:08,905 Found index url https://pypi.org/simple/
2023-08-26T01:09:08,955 Fetched page https://pypi.org/simple/flash-attention-softmax-n/ as application/vnd.pypi.simple.v1+json
2023-08-26T01:09:08,957 Found link https://files.pythonhosted.org/packages/25/2d/a1342b986bf86672e8d50958c2f52b2853f0782a3ae6edb6c2ad39ea6a1b/flash-attention-softmax-n-0.1.0rc6.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.0rc6
2023-08-26T01:09:08,958 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/10/2c/f0ccd8d90a0ce40deff97d3ca681a46d37fcef33cddee004b37bd0e39c45/flash_attention_softmax_n-0.1.0rc6-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-08-26T01:09:08,958 Found link https://files.pythonhosted.org/packages/0e/4f/cc434c22d8d6306db3c5cb3d57bb81936e7e1af6a30c33d16935493ded5b/flash-attention-softmax-n-0.1.0.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.0
2023-08-26T01:09:08,958 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/03/e4/52a47eaff9cce0e4c9d5dcf801a44fd9f2abee05dfba50f8a446eda6265b/flash_attention_softmax_n-0.1.0-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-08-26T01:09:08,959 Found link https://files.pythonhosted.org/packages/eb/40/63a4d7523c8f29a321b8937149345b9506418776c527ede979c309f45f1f/flash-attention-softmax-n-0.1.1.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.1
2023-08-26T01:09:08,959 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/f6/b8/85081626f879c4d5be9fe8dbf3d7a17682779c7e7b6e3bc7160ad6246bff/flash_attention_softmax_n-0.1.1-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-08-26T01:09:08,960 Fetching project page and analyzing links: https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-08-26T01:09:08,960 Getting page https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-08-26T01:09:08,961 Found index url https://www.piwheels.org/simple/
2023-08-26T01:09:09,111 Fetched page https://www.piwheels.org/simple/flash-attention-softmax-n/ as text/html
2023-08-26T01:09:09,112 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.0-py3-none-any.whl#sha256=730f00afb82c068e4a9456e255e7a2bc68f2f4acb76668c32a6a745b3a784896 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-08-26T01:09:09,113 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.0rc6-py3-none-any.whl#sha256=2cd65986cf280ff10cef44640cfd6f5a7ecead0b80f310cf10b3404bd04953b6 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-08-26T01:09:09,113 Skipping link: not a file: https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-08-26T01:09:09,113 Skipping link: not a file: https://pypi.org/simple/flash-attention-softmax-n/
2023-08-26T01:09:09,131 Given no hashes to check 1 links for project 'flash-attention-softmax-n': discarding no candidates
2023-08-26T01:09:09,147 Collecting flash-attention-softmax-n==0.1.1
2023-08-26T01:09:09,149 Created temporary directory: /tmp/pip-unpack-smldbkqa
2023-08-26T01:09:09,273 Downloading flash-attention-softmax-n-0.1.1.tar.gz (21 kB)
2023-08-26T01:09:09,328 Added flash-attention-softmax-n==0.1.1 from https://files.pythonhosted.org/packages/eb/40/63a4d7523c8f29a321b8937149345b9506418776c527ede979c309f45f1f/flash-attention-softmax-n-0.1.1.tar.gz to build tracker '/tmp/pip-build-tracker-85nou9w7'
2023-08-26T01:09:09,331 Running setup.py (path:/tmp/pip-wheel-b5_yxxgc/flash-attention-softmax-n_8828402462a7415d91075d4e10ecce1f/setup.py) egg_info for package flash-attention-softmax-n
2023-08-26T01:09:09,332 Created temporary directory: /tmp/pip-pip-egg-info-w2v016r9
2023-08-26T01:09:09,333 Preparing metadata (setup.py): started
2023-08-26T01:09:09,334 Running command python setup.py egg_info
2023-08-26T01:09:09,981 running egg_info
2023-08-26T01:09:09,983 creating /tmp/pip-pip-egg-info-w2v016r9/flash_attention_softmax_n.egg-info
2023-08-26T01:09:10,012 writing /tmp/pip-pip-egg-info-w2v016r9/flash_attention_softmax_n.egg-info/PKG-INFO
2023-08-26T01:09:10,014 writing dependency_links to /tmp/pip-pip-egg-info-w2v016r9/flash_attention_softmax_n.egg-info/dependency_links.txt
2023-08-26T01:09:10,017 writing requirements to /tmp/pip-pip-egg-info-w2v016r9/flash_attention_softmax_n.egg-info/requires.txt
2023-08-26T01:09:10,018 writing top-level names to /tmp/pip-pip-egg-info-w2v016r9/flash_attention_softmax_n.egg-info/top_level.txt
2023-08-26T01:09:10,019 writing manifest file '/tmp/pip-pip-egg-info-w2v016r9/flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-08-26T01:09:10,102 reading manifest file '/tmp/pip-pip-egg-info-w2v016r9/flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-08-26T01:09:10,103 adding license file 'LICENSE'
2023-08-26T01:09:10,105 writing manifest file '/tmp/pip-pip-egg-info-w2v016r9/flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-08-26T01:09:10,166 Preparing metadata (setup.py): finished with status 'done'
2023-08-26T01:09:10,171 Source in /tmp/pip-wheel-b5_yxxgc/flash-attention-softmax-n_8828402462a7415d91075d4e10ecce1f has version 0.1.1, which satisfies requirement flash-attention-softmax-n==0.1.1 from https://files.pythonhosted.org/packages/eb/40/63a4d7523c8f29a321b8937149345b9506418776c527ede979c309f45f1f/flash-attention-softmax-n-0.1.1.tar.gz
2023-08-26T01:09:10,172 Removed flash-attention-softmax-n==0.1.1 from https://files.pythonhosted.org/packages/eb/40/63a4d7523c8f29a321b8937149345b9506418776c527ede979c309f45f1f/flash-attention-softmax-n-0.1.1.tar.gz from build tracker '/tmp/pip-build-tracker-85nou9w7'
2023-08-26T01:09:10,178 Created temporary directory: /tmp/pip-unpack-tzebdphy
2023-08-26T01:09:10,179 Building wheels for collected packages: flash-attention-softmax-n
2023-08-26T01:09:10,182 Created temporary directory: /tmp/pip-wheel-b_xdbzrk
2023-08-26T01:09:10,183 Building wheel for flash-attention-softmax-n (setup.py): started
2023-08-26T01:09:10,184 Destination directory: /tmp/pip-wheel-b_xdbzrk
2023-08-26T01:09:10,184 Running command python setup.py bdist_wheel
2023-08-26T01:09:10,825 running bdist_wheel
2023-08-26T01:09:11,113 running build
2023-08-26T01:09:11,114 running build_py
2023-08-26T01:09:11,143 creating build
2023-08-26T01:09:11,144 creating build/lib
2023-08-26T01:09:11,145 creating build/lib/flash_attention_softmax_n
2023-08-26T01:09:11,146 copying flash_attention_softmax_n/flash_attn_triton.py -> build/lib/flash_attention_softmax_n
2023-08-26T01:09:11,148 copying flash_attention_softmax_n/flash_attn.py -> build/lib/flash_attention_softmax_n
2023-08-26T01:09:11,151 copying flash_attention_softmax_n/functional.py -> build/lib/flash_attention_softmax_n
2023-08-26T01:09:11,152 copying flash_attention_softmax_n/__init__.py -> build/lib/flash_attention_softmax_n
2023-08-26T01:09:11,185 /home/piwheels/.local/lib/python3.9/site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
2023-08-26T01:09:11,186 !!
2023-08-26T01:09:11,186 ********************************************************************************
2023-08-26T01:09:11,186 Please avoid running ``setup.py`` directly.
2023-08-26T01:09:11,186 Instead, use pypa/build, pypa/installer or
2023-08-26T01:09:11,186 other standards-based tools.
2023-08-26T01:09:11,187 See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
2023-08-26T01:09:11,187 ********************************************************************************
2023-08-26T01:09:11,187 !!
2023-08-26T01:09:11,187 self.initialize_options()
2023-08-26T01:09:11,213 installing to build/bdist.linux-armv7l/wheel
2023-08-26T01:09:11,213 running install
2023-08-26T01:09:11,238 running install_lib
2023-08-26T01:09:11,267 creating build/bdist.linux-armv7l
2023-08-26T01:09:11,267 creating build/bdist.linux-armv7l/wheel
2023-08-26T01:09:11,269 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n
2023-08-26T01:09:11,270 copying build/lib/flash_attention_softmax_n/flash_attn_triton.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n
2023-08-26T01:09:11,273 copying build/lib/flash_attention_softmax_n/flash_attn.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n
2023-08-26T01:09:11,275 copying build/lib/flash_attention_softmax_n/functional.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n
2023-08-26T01:09:11,277 copying build/lib/flash_attention_softmax_n/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n
2023-08-26T01:09:11,278 running install_egg_info
2023-08-26T01:09:11,346 running egg_info
2023-08-26T01:09:11,371 writing flash_attention_softmax_n.egg-info/PKG-INFO
2023-08-26T01:09:11,373 writing dependency_links to flash_attention_softmax_n.egg-info/dependency_links.txt
2023-08-26T01:09:11,375 writing requirements to flash_attention_softmax_n.egg-info/requires.txt
2023-08-26T01:09:11,377 writing top-level names to flash_attention_softmax_n.egg-info/top_level.txt
2023-08-26T01:09:11,405 reading manifest file 'flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-08-26T01:09:11,407 adding license file 'LICENSE'
2023-08-26T01:09:11,409 writing manifest file 'flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-08-26T01:09:11,411 Copying flash_attention_softmax_n.egg-info to build/bdist.linux-armv7l/wheel/flash_attention_softmax_n-0.1.1-py3.9.egg-info
2023-08-26T01:09:11,421 running install_scripts
2023-08-26T01:09:11,437 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n-0.1.1.dist-info/WHEEL
2023-08-26T01:09:11,440 creating '/tmp/pip-wheel-b_xdbzrk/flash_attention_softmax_n-0.1.1-py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it
2023-08-26T01:09:11,443 adding 'flash_attention_softmax_n/__init__.py'
2023-08-26T01:09:11,444 adding 'flash_attention_softmax_n/flash_attn.py'
2023-08-26T01:09:11,446 adding 'flash_attention_softmax_n/flash_attn_triton.py'
2023-08-26T01:09:11,448 adding 'flash_attention_softmax_n/functional.py'
2023-08-26T01:09:11,455 adding 'flash_attention_softmax_n-0.1.1.dist-info/LICENSE'
2023-08-26T01:09:11,457 adding 'flash_attention_softmax_n-0.1.1.dist-info/METADATA'
2023-08-26T01:09:11,458 adding 'flash_attention_softmax_n-0.1.1.dist-info/WHEEL'
2023-08-26T01:09:11,459 adding 'flash_attention_softmax_n-0.1.1.dist-info/top_level.txt'
2023-08-26T01:09:11,460 adding 'flash_attention_softmax_n-0.1.1.dist-info/RECORD'
2023-08-26T01:09:11,461 removing build/bdist.linux-armv7l/wheel
2023-08-26T01:09:11,547 Building wheel for flash-attention-softmax-n (setup.py): finished with status 'done'
2023-08-26T01:09:11,550 Created wheel for flash-attention-softmax-n: filename=flash_attention_softmax_n-0.1.1-py3-none-any.whl size=21692 sha256=d906038cc35131336cc7a324748ca86291c64e02cfebcb61345c4f951f35201e
2023-08-26T01:09:11,551 Stored in directory: /tmp/pip-ephem-wheel-cache-ocem_3e_/wheels/0c/90/51/96b1ebaf7fb4e771ca0ebfa52acf5c8a202fb35a5cfb7dd779
2023-08-26T01:09:11,563 Successfully built flash-attention-softmax-n
2023-08-26T01:09:11,567 Removed build tracker: '/tmp/pip-build-tracker-85nou9w7'