2023-09-05T16:32:28,789 Created temporary directory: /tmp/pip-build-tracker-cmwgaggy
2023-09-05T16:32:28,791 Initialized build tracking at /tmp/pip-build-tracker-cmwgaggy
2023-09-05T16:32:28,791 Created build tracker: /tmp/pip-build-tracker-cmwgaggy
2023-09-05T16:32:28,791 Entered build tracker: /tmp/pip-build-tracker-cmwgaggy
2023-09-05T16:32:28,791 Created temporary directory: /tmp/pip-wheel-99rg21o4
2023-09-05T16:32:28,795 Created temporary directory: /tmp/pip-ephem-wheel-cache-sl4cluuu
2023-09-05T16:32:28,818 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
2023-09-05T16:32:28,821 2 location(s) to search for versions of flash-attention-softmax-n:
2023-09-05T16:32:28,821 * https://pypi.org/simple/flash-attention-softmax-n/
2023-09-05T16:32:28,821 * https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-09-05T16:32:28,821 Fetching project page and analyzing links: https://pypi.org/simple/flash-attention-softmax-n/
2023-09-05T16:32:28,822 Getting page https://pypi.org/simple/flash-attention-softmax-n/
2023-09-05T16:32:28,824 Found index url https://pypi.org/simple/
2023-09-05T16:32:28,884 Fetched page https://pypi.org/simple/flash-attention-softmax-n/ as application/vnd.pypi.simple.v1+json
2023-09-05T16:32:28,888 Found link https://files.pythonhosted.org/packages/25/2d/a1342b986bf86672e8d50958c2f52b2853f0782a3ae6edb6c2ad39ea6a1b/flash-attention-softmax-n-0.1.0rc6.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.0rc6
2023-09-05T16:32:28,889 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/10/2c/f0ccd8d90a0ce40deff97d3ca681a46d37fcef33cddee004b37bd0e39c45/flash_attention_softmax_n-0.1.0rc6-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:28,889 Found link https://files.pythonhosted.org/packages/0e/4f/cc434c22d8d6306db3c5cb3d57bb81936e7e1af6a30c33d16935493ded5b/flash-attention-softmax-n-0.1.0.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.0
2023-09-05T16:32:28,890 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/03/e4/52a47eaff9cce0e4c9d5dcf801a44fd9f2abee05dfba50f8a446eda6265b/flash_attention_softmax_n-0.1.0-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:28,890 Found link https://files.pythonhosted.org/packages/eb/40/63a4d7523c8f29a321b8937149345b9506418776c527ede979c309f45f1f/flash-attention-softmax-n-0.1.1.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.1
2023-09-05T16:32:28,890 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/f6/b8/85081626f879c4d5be9fe8dbf3d7a17682779c7e7b6e3bc7160ad6246bff/flash_attention_softmax_n-0.1.1-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:28,891 Found link https://files.pythonhosted.org/packages/d0/09/89d6f11f17833d20af896173389adb89799f7350f508118a32ea7957959a/flash-attention-softmax-n-0.1.2.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.2
2023-09-05T16:32:28,891 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/d7/21/3139d7409eb50735b4c5b3d28526352e430a2a2ebd045ecc45bd33293f3e/flash_attention_softmax_n-0.1.2-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:28,892 Found link https://files.pythonhosted.org/packages/ea/22/ccad6b7ffa8e981f73d732fdead7c2f94db7e1c94da9be59661947ee3c72/flash-attention-softmax-n-0.1.3.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.3
2023-09-05T16:32:28,892 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/0e/2b/5d9a20a0cf2ae141033fcaefa6c69e1f0864176228cb45f03ea60ea3f458/flash_attention_softmax_n-0.1.3-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:28,893 Found link https://files.pythonhosted.org/packages/5e/9f/8c2ecc68d2f1a185d9f58e37fbbab2acfca4efd2b3d5d634408e6b546811/flash-attention-softmax-n-0.1.4.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.4
2023-09-05T16:32:28,893 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/fa/0a/9445d4a6183a06054891f20e58588adebd3f9345bb3bf13afd5f2159897f/flash_attention_softmax_n-0.1.4-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:28,893 Found link https://files.pythonhosted.org/packages/fd/49/36da5f1f1d83cf8315da92f8fd7adc055b243814f8c795b39015bf5f7dad/flash-attention-softmax-n-0.2.0.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.2.0
2023-09-05T16:32:28,894 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/8a/ac/0db302116371171c20ba166039827434b1ac9e20e5d0c669213761792052/flash_attention_softmax_n-0.2.0-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:28,894 Found link https://files.pythonhosted.org/packages/5b/bc/90fdc74a3e8dc1c0d1911b7f7a0cffc94e86c9a5ce5cf652d8b13477a34a/flash-attention-softmax-n-0.2.1.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.2.1
2023-09-05T16:32:28,894 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/fd/b8/e381de523f3927c56c43011e0b2f5a2f39bd50012087a2b35d12278315c0/flash_attention_softmax_n-0.2.1-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:28,895 Found link https://files.pythonhosted.org/packages/97/1d/c1a27b13be8ddaf60daf86d26e639a314004563c4d8bf4702b50cd1255b4/flash-attention-softmax-n-0.3.0.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.3.0
2023-09-05T16:32:28,895 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/ba/49/924b5963fcae64e70221f5096659be812ee088d885026cc506036219320c/flash_attention_softmax_n-0.3.0-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:28,896 Fetching project page and analyzing links: https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-09-05T16:32:28,896 Getting page https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-09-05T16:32:28,897 Found index url https://www.piwheels.org/simple/
2023-09-05T16:32:29,110 Fetched page https://www.piwheels.org/simple/flash-attention-softmax-n/ as text/html
2023-09-05T16:32:29,113 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.2.1-py3-none-any.whl#sha256=e41d35be2f3fea375a98acf80f6f4eb013e55c39b9ba231c4c0a12d046610755 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:29,114 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.2.0-py3-none-any.whl#sha256=5a2688132ed69a84e44af1726e418a63020dcbbc685982ac369747bcd9e8808c (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:29,114 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.4-py3-none-any.whl#sha256=16f99402e918bba0cc9a6666cd3c06693ea0779f5d9d890d469fa18c466e4fce (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:29,114 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.3-py3-none-any.whl#sha256=330330ff2c854d62599ba39620220bdd55b187d6680e9027ed0486cc1077cf04 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:29,115 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.2-py3-none-any.whl#sha256=54a16bb63bb01235b3b4b49d98b7e1c20b6108f3ba638e8928e999a1a631e676 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:29,115 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.1-py3-none-any.whl#sha256=d906038cc35131336cc7a324748ca86291c64e02cfebcb61345c4f951f35201e (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:29,115 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.0-py3-none-any.whl#sha256=730f00afb82c068e4a9456e255e7a2bc68f2f4acb76668c32a6a745b3a784896 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:29,115 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.0rc6-py3-none-any.whl#sha256=2cd65986cf280ff10cef44640cfd6f5a7ecead0b80f310cf10b3404bd04953b6 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-09-05T16:32:29,115 Skipping link: not a file: https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-09-05T16:32:29,116 Skipping link: not a file: https://pypi.org/simple/flash-attention-softmax-n/
2023-09-05T16:32:29,134 Given no hashes to check 1 links for project 'flash-attention-softmax-n': discarding no candidates
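The resolver above discarded every wheel ("No binaries permitted") and kept only the 0.3.0 sdist. A minimal sketch of a comparable source-only build request, driving pip from Python via subprocess; the exact command line piwheels used is not shown in this log, so the flags below are an approximation:

    import subprocess, sys

    # Approximate reconstruction (not the literal piwheels invocation):
    # --no-binary :all: forces the sdist, matching the "No binaries permitted" lines,
    # and piwheels.org is added as a second index alongside PyPI.
    subprocess.run(
        [
            sys.executable, "-m", "pip", "wheel",
            "--no-binary", ":all:",
            "--extra-index-url", "https://www.piwheels.org/simple",
            "flash-attention-softmax-n==0.3.0",
        ],
        check=True,
    )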
2023-09-05T16:32:29,150 Collecting flash-attention-softmax-n==0.3.0
2023-09-05T16:32:29,153 Created temporary directory: /tmp/pip-unpack-r014n9sx
2023-09-05T16:32:29,294 Downloading flash-attention-softmax-n-0.3.0.tar.gz (31 kB)
2023-09-05T16:32:29,361 Added flash-attention-softmax-n==0.3.0 from https://files.pythonhosted.org/packages/97/1d/c1a27b13be8ddaf60daf86d26e639a314004563c4d8bf4702b50cd1255b4/flash-attention-softmax-n-0.3.0.tar.gz to build tracker '/tmp/pip-build-tracker-cmwgaggy'
2023-09-05T16:32:29,363 Running setup.py (path:/tmp/pip-wheel-99rg21o4/flash-attention-softmax-n_a811c3ac3f6f4ee5a3f2cb2e0813e88e/setup.py) egg_info for package flash-attention-softmax-n
2023-09-05T16:32:29,364 Created temporary directory: /tmp/pip-pip-egg-info-rajj8aye
2023-09-05T16:32:29,364 Preparing metadata (setup.py): started
2023-09-05T16:32:29,365 Running command python setup.py egg_info
2023-09-05T16:32:29,987 running egg_info
2023-09-05T16:32:29,989 creating /tmp/pip-pip-egg-info-rajj8aye/flash_attention_softmax_n.egg-info
2023-09-05T16:32:30,017 writing /tmp/pip-pip-egg-info-rajj8aye/flash_attention_softmax_n.egg-info/PKG-INFO
2023-09-05T16:32:30,020 writing dependency_links to /tmp/pip-pip-egg-info-rajj8aye/flash_attention_softmax_n.egg-info/dependency_links.txt
2023-09-05T16:32:30,022 writing requirements to /tmp/pip-pip-egg-info-rajj8aye/flash_attention_softmax_n.egg-info/requires.txt
2023-09-05T16:32:30,024 writing top-level names to /tmp/pip-pip-egg-info-rajj8aye/flash_attention_softmax_n.egg-info/top_level.txt
2023-09-05T16:32:30,025 writing manifest file '/tmp/pip-pip-egg-info-rajj8aye/flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-09-05T16:32:30,107 reading manifest file '/tmp/pip-pip-egg-info-rajj8aye/flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-09-05T16:32:30,109 adding license file 'LICENSE'
2023-09-05T16:32:30,111 writing manifest file '/tmp/pip-pip-egg-info-rajj8aye/flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-09-05T16:32:30,171 Preparing metadata (setup.py): finished with status 'done'
2023-09-05T16:32:30,178 Source in /tmp/pip-wheel-99rg21o4/flash-attention-softmax-n_a811c3ac3f6f4ee5a3f2cb2e0813e88e has version 0.3.0, which satisfies requirement flash-attention-softmax-n==0.3.0 from https://files.pythonhosted.org/packages/97/1d/c1a27b13be8ddaf60daf86d26e639a314004563c4d8bf4702b50cd1255b4/flash-attention-softmax-n-0.3.0.tar.gz
2023-09-05T16:32:30,179 Removed flash-attention-softmax-n==0.3.0 from https://files.pythonhosted.org/packages/97/1d/c1a27b13be8ddaf60daf86d26e639a314004563c4d8bf4702b50cd1255b4/flash-attention-softmax-n-0.3.0.tar.gz from build tracker '/tmp/pip-build-tracker-cmwgaggy'
2023-09-05T16:32:30,186 Created temporary directory: /tmp/pip-unpack-rrntwo7p
2023-09-05T16:32:30,186 Building wheels for collected packages: flash-attention-softmax-n
2023-09-05T16:32:30,191 Created temporary directory: /tmp/pip-wheel-qswcuzku
2023-09-05T16:32:30,191 Building wheel for flash-attention-softmax-n (setup.py): started
2023-09-05T16:32:30,192 Destination directory: /tmp/pip-wheel-qswcuzku
2023-09-05T16:32:30,192 Running command python setup.py bdist_wheel
2023-09-05T16:32:30,865 running bdist_wheel
2023-09-05T16:32:31,158 running build
2023-09-05T16:32:31,158 running build_py
2023-09-05T16:32:31,188 creating build
2023-09-05T16:32:31,188 creating build/lib
2023-09-05T16:32:31,189 creating build/lib/flash_attention_softmax_n
2023-09-05T16:32:31,191 copying flash_attention_softmax_n/__init__.py -> build/lib/flash_attention_softmax_n
2023-09-05T16:32:31,193 creating build/lib/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,194 copying flash_attention_softmax_n/analysis/statistics.py -> build/lib/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,197 copying flash_attention_softmax_n/analysis/hooks.py -> build/lib/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,199 copying flash_attention_softmax_n/analysis/io.py -> build/lib/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,200 copying flash_attention_softmax_n/analysis/__init__.py -> build/lib/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,203 creating build/lib/flash_attention_softmax_n/surgery
2023-09-05T16:32:31,204 copying flash_attention_softmax_n/surgery/attention_softmax_n.py -> build/lib/flash_attention_softmax_n/surgery
2023-09-05T16:32:31,206 copying flash_attention_softmax_n/surgery/__init__.py -> build/lib/flash_attention_softmax_n/surgery
2023-09-05T16:32:31,208 creating build/lib/flash_attention_softmax_n/core
2023-09-05T16:32:31,209 copying flash_attention_softmax_n/core/functional.py -> build/lib/flash_attention_softmax_n/core
2023-09-05T16:32:31,211 copying flash_attention_softmax_n/core/flash_attn.py -> build/lib/flash_attention_softmax_n/core
2023-09-05T16:32:31,214 copying flash_attention_softmax_n/core/__init__.py -> build/lib/flash_attention_softmax_n/core
2023-09-05T16:32:31,215 copying flash_attention_softmax_n/core/flash_attn_triton.py -> build/lib/flash_attention_softmax_n/core
2023-09-05T16:32:31,218 creating build/lib/flash_attention_softmax_n/surgery/surgery_functions
2023-09-05T16:32:31,219 copying flash_attention_softmax_n/surgery/surgery_functions/_bert.py -> build/lib/flash_attention_softmax_n/surgery/surgery_functions
2023-09-05T16:32:31,222 copying flash_attention_softmax_n/surgery/surgery_functions/__init__.py -> build/lib/flash_attention_softmax_n/surgery/surgery_functions
2023-09-05T16:32:31,223 copying flash_attention_softmax_n/surgery/surgery_functions/utils.py -> build/lib/flash_attention_softmax_n/surgery/surgery_functions
2023-09-05T16:32:31,259 /home/piwheels/.local/lib/python3.9/site-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
2023-09-05T16:32:31,259 !!
2023-09-05T16:32:31,260 ********************************************************************************
2023-09-05T16:32:31,260 Please avoid running ``setup.py`` directly.
2023-09-05T16:32:31,260 Instead, use pypa/build, pypa/installer, pypa/build or
2023-09-05T16:32:31,260 other standards-based tools.
2023-09-05T16:32:31,260 See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
2023-09-05T16:32:31,260 ********************************************************************************
2023-09-05T16:32:31,261 !!
2023-09-05T16:32:31,261 self.initialize_options()
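The SetuptoolsDeprecationWarning above recommends standards-based tooling instead of invoking setup.py directly. A minimal sketch of the equivalent step with pypa/build, assuming the `build` package is installed and reusing the unpacked sdist path from this log; this is an illustration, not what piwheels actually ran:

    import subprocess, sys

    # Hypothetical equivalent of "python setup.py bdist_wheel" using pypa/build.
    src_dir = "/tmp/pip-wheel-99rg21o4/flash-attention-softmax-n_a811c3ac3f6f4ee5a3f2cb2e0813e88e"
    subprocess.run([sys.executable, "-m", "build", "--wheel", src_dir], check=True)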
2023-09-05T16:32:31,287 installing to build/bdist.linux-armv7l/wheel
2023-09-05T16:32:31,288 running install
2023-09-05T16:32:31,312 running install_lib
2023-09-05T16:32:31,340 creating build/bdist.linux-armv7l
2023-09-05T16:32:31,341 creating build/bdist.linux-armv7l/wheel
2023-09-05T16:32:31,343 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n
2023-09-05T16:32:31,344 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,345 copying build/lib/flash_attention_softmax_n/analysis/statistics.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,347 copying build/lib/flash_attention_softmax_n/analysis/hooks.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,349 copying build/lib/flash_attention_softmax_n/analysis/io.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,351 copying build/lib/flash_attention_softmax_n/analysis/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-09-05T16:32:31,354 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery
2023-09-05T16:32:31,355 copying build/lib/flash_attention_softmax_n/surgery/attention_softmax_n.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery
2023-09-05T16:32:31,357 copying build/lib/flash_attention_softmax_n/surgery/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery
2023-09-05T16:32:31,359 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery/surgery_functions
2023-09-05T16:32:31,360 copying build/lib/flash_attention_softmax_n/surgery/surgery_functions/_bert.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery/surgery_functions
2023-09-05T16:32:31,362 copying build/lib/flash_attention_softmax_n/surgery/surgery_functions/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery/surgery_functions
2023-09-05T16:32:31,364 copying build/lib/flash_attention_softmax_n/surgery/surgery_functions/utils.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery/surgery_functions
2023-09-05T16:32:31,366 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-09-05T16:32:31,367 copying build/lib/flash_attention_softmax_n/core/functional.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-09-05T16:32:31,369 copying build/lib/flash_attention_softmax_n/core/flash_attn.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-09-05T16:32:31,372 copying build/lib/flash_attention_softmax_n/core/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-09-05T16:32:31,373 copying build/lib/flash_attention_softmax_n/core/flash_attn_triton.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-09-05T16:32:31,375 copying build/lib/flash_attention_softmax_n/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n
2023-09-05T16:32:31,377 running install_egg_info
2023-09-05T16:32:31,442 running egg_info
2023-09-05T16:32:31,467 writing flash_attention_softmax_n.egg-info/PKG-INFO
2023-09-05T16:32:31,469 writing dependency_links to flash_attention_softmax_n.egg-info/dependency_links.txt
2023-09-05T16:32:31,471 writing requirements to flash_attention_softmax_n.egg-info/requires.txt
2023-09-05T16:32:31,473 writing top-level names to flash_attention_softmax_n.egg-info/top_level.txt
2023-09-05T16:32:31,501 reading manifest file 'flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-09-05T16:32:31,503 adding license file 'LICENSE'
2023-09-05T16:32:31,506 writing manifest file 'flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-09-05T16:32:31,508 Copying flash_attention_softmax_n.egg-info to build/bdist.linux-armv7l/wheel/flash_attention_softmax_n-0.3.0-py3.9.egg-info
2023-09-05T16:32:31,517 running install_scripts
2023-09-05T16:32:31,534 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n-0.3.0.dist-info/WHEEL
2023-09-05T16:32:31,537 creating '/tmp/pip-wheel-qswcuzku/flash_attention_softmax_n-0.3.0-py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it
2023-09-05T16:32:31,539 adding 'flash_attention_softmax_n/__init__.py'
2023-09-05T16:32:31,540 adding 'flash_attention_softmax_n/analysis/__init__.py'
2023-09-05T16:32:31,542 adding 'flash_attention_softmax_n/analysis/hooks.py'
2023-09-05T16:32:31,543 adding 'flash_attention_softmax_n/analysis/io.py'
2023-09-05T16:32:31,544 adding 'flash_attention_softmax_n/analysis/statistics.py'
2023-09-05T16:32:31,546 adding 'flash_attention_softmax_n/core/__init__.py'
2023-09-05T16:32:31,547 adding 'flash_attention_softmax_n/core/flash_attn.py'
2023-09-05T16:32:31,550 adding 'flash_attention_softmax_n/core/flash_attn_triton.py'
2023-09-05T16:32:31,551 adding 'flash_attention_softmax_n/core/functional.py'
2023-09-05T16:32:31,553 adding 'flash_attention_softmax_n/surgery/__init__.py'
2023-09-05T16:32:31,554 adding 'flash_attention_softmax_n/surgery/attention_softmax_n.py'
2023-09-05T16:32:31,556 adding 'flash_attention_softmax_n/surgery/surgery_functions/__init__.py'
2023-09-05T16:32:31,557 adding 'flash_attention_softmax_n/surgery/surgery_functions/_bert.py'
2023-09-05T16:32:31,559 adding 'flash_attention_softmax_n/surgery/surgery_functions/utils.py'
2023-09-05T16:32:31,566 adding 'flash_attention_softmax_n-0.3.0.dist-info/LICENSE'
2023-09-05T16:32:31,568 adding 'flash_attention_softmax_n-0.3.0.dist-info/METADATA'
2023-09-05T16:32:31,569 adding 'flash_attention_softmax_n-0.3.0.dist-info/WHEEL'
2023-09-05T16:32:31,570 adding 'flash_attention_softmax_n-0.3.0.dist-info/top_level.txt'
2023-09-05T16:32:31,571 adding 'flash_attention_softmax_n-0.3.0.dist-info/RECORD'
2023-09-05T16:32:31,573 removing build/bdist.linux-armv7l/wheel
2023-09-05T16:32:31,661 Building wheel for flash-attention-softmax-n (setup.py): finished with status 'done'
2023-09-05T16:32:31,664 Created wheel for flash-attention-softmax-n: filename=flash_attention_softmax_n-0.3.0-py3-none-any.whl size=33446 sha256=021016968ea910c2d56f23ff965a97fa4fea5dc97f37a5271a5d3dbd3cdf1a87
2023-09-05T16:32:31,665 Stored in directory: /tmp/pip-ephem-wheel-cache-sl4cluuu/wheels/12/06/1e/33cc88e95d32d3bf8053af074f18443c4d8f59d293a4b96fa5
2023-09-05T16:32:31,679 Successfully built flash-attention-softmax-n
2023-09-05T16:32:31,683 Removed build tracker: '/tmp/pip-build-tracker-cmwgaggy'
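The build finished with a reported sha256 digest for flash_attention_softmax_n-0.3.0-py3-none-any.whl. A small check of a locally downloaded copy against that value, assuming the wheel file sits in the current directory (the path is an assumption, since the /tmp locations above are ephemeral):

    import hashlib

    # Compare a local copy of the wheel against the sha256 pip reported above.
    expected = "021016968ea910c2d56f23ff965a97fa4fea5dc97f37a5271a5d3dbd3cdf1a87"
    with open("flash_attention_softmax_n-0.3.0-py3-none-any.whl", "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()
    print("match" if actual == expected else f"mismatch: {actual}")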