2023-11-25T06:23:08,061 Created temporary directory: /tmp/pip-build-tracker-de_ayto_
2023-11-25T06:23:08,064 Initialized build tracking at /tmp/pip-build-tracker-de_ayto_
2023-11-25T06:23:08,065 Created build tracker: /tmp/pip-build-tracker-de_ayto_
2023-11-25T06:23:08,066 Entered build tracker: /tmp/pip-build-tracker-de_ayto_
2023-11-25T06:23:08,068 Created temporary directory: /tmp/pip-wheel-v8evf89s
2023-11-25T06:23:08,076 Created temporary directory: /tmp/pip-ephem-wheel-cache-ktl8mf01
2023-11-25T06:23:08,135 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
2023-11-25T06:23:08,140 2 location(s) to search for versions of flash-attention-softmax-n:
2023-11-25T06:23:08,140 * https://pypi.org/simple/flash-attention-softmax-n/
2023-11-25T06:23:08,140 * https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-11-25T06:23:08,141 Fetching project page and analyzing links: https://pypi.org/simple/flash-attention-softmax-n/
2023-11-25T06:23:08,143 Getting page https://pypi.org/simple/flash-attention-softmax-n/
2023-11-25T06:23:08,145 Found index url https://pypi.org/simple/
2023-11-25T06:23:08,395 Fetched page https://pypi.org/simple/flash-attention-softmax-n/ as application/vnd.pypi.simple.v1+json
2023-11-25T06:23:08,400 Found link https://files.pythonhosted.org/packages/25/2d/a1342b986bf86672e8d50958c2f52b2853f0782a3ae6edb6c2ad39ea6a1b/flash-attention-softmax-n-0.1.0rc6.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.0rc6
2023-11-25T06:23:08,401 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/10/2c/f0ccd8d90a0ce40deff97d3ca681a46d37fcef33cddee004b37bd0e39c45/flash_attention_softmax_n-0.1.0rc6-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,401 Found link https://files.pythonhosted.org/packages/0e/4f/cc434c22d8d6306db3c5cb3d57bb81936e7e1af6a30c33d16935493ded5b/flash-attention-softmax-n-0.1.0.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.0
2023-11-25T06:23:08,402 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/03/e4/52a47eaff9cce0e4c9d5dcf801a44fd9f2abee05dfba50f8a446eda6265b/flash_attention_softmax_n-0.1.0-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,403 Found link https://files.pythonhosted.org/packages/eb/40/63a4d7523c8f29a321b8937149345b9506418776c527ede979c309f45f1f/flash-attention-softmax-n-0.1.1.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.1
2023-11-25T06:23:08,404 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/f6/b8/85081626f879c4d5be9fe8dbf3d7a17682779c7e7b6e3bc7160ad6246bff/flash_attention_softmax_n-0.1.1-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,405 Found link https://files.pythonhosted.org/packages/d0/09/89d6f11f17833d20af896173389adb89799f7350f508118a32ea7957959a/flash-attention-softmax-n-0.1.2.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.2
2023-11-25T06:23:08,405 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/d7/21/3139d7409eb50735b4c5b3d28526352e430a2a2ebd045ecc45bd33293f3e/flash_attention_softmax_n-0.1.2-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,406 Found link https://files.pythonhosted.org/packages/ea/22/ccad6b7ffa8e981f73d732fdead7c2f94db7e1c94da9be59661947ee3c72/flash-attention-softmax-n-0.1.3.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.3
2023-11-25T06:23:08,407 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/0e/2b/5d9a20a0cf2ae141033fcaefa6c69e1f0864176228cb45f03ea60ea3f458/flash_attention_softmax_n-0.1.3-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,408 Found link https://files.pythonhosted.org/packages/5e/9f/8c2ecc68d2f1a185d9f58e37fbbab2acfca4efd2b3d5d634408e6b546811/flash-attention-softmax-n-0.1.4.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.1.4
2023-11-25T06:23:08,409 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/fa/0a/9445d4a6183a06054891f20e58588adebd3f9345bb3bf13afd5f2159897f/flash_attention_softmax_n-0.1.4-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,410 Found link https://files.pythonhosted.org/packages/fd/49/36da5f1f1d83cf8315da92f8fd7adc055b243814f8c795b39015bf5f7dad/flash-attention-softmax-n-0.2.0.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.2.0
2023-11-25T06:23:08,410 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/8a/ac/0db302116371171c20ba166039827434b1ac9e20e5d0c669213761792052/flash_attention_softmax_n-0.2.0-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,411 Found link https://files.pythonhosted.org/packages/5b/bc/90fdc74a3e8dc1c0d1911b7f7a0cffc94e86c9a5ce5cf652d8b13477a34a/flash-attention-softmax-n-0.2.1.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.2.1
2023-11-25T06:23:08,412 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/fd/b8/e381de523f3927c56c43011e0b2f5a2f39bd50012087a2b35d12278315c0/flash_attention_softmax_n-0.2.1-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,413 Found link https://files.pythonhosted.org/packages/97/1d/c1a27b13be8ddaf60daf86d26e639a314004563c4d8bf4702b50cd1255b4/flash-attention-softmax-n-0.3.0.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.3.0
2023-11-25T06:23:08,413 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/ba/49/924b5963fcae64e70221f5096659be812ee088d885026cc506036219320c/flash_attention_softmax_n-0.3.0-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,414 Found link https://files.pythonhosted.org/packages/d2/48/b8cda31048d822604c18ecb469716f379fdde3b2a4ea8765cecab86f10ea/flash-attention-softmax-n-0.3.1.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.3.1
2023-11-25T06:23:08,415 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/ac/97/ae7da8ea4fc016b323c91ae0e3b5fbd6015207146d8ec40e07856a0cb1f6/flash_attention_softmax_n-0.3.1-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,415 Found link https://files.pythonhosted.org/packages/c1/f2/4d5520611214fe18a8163f798532b5b7d43fca191b11a811f0441135cc83/flash-attention-softmax-n-0.3.2.tar.gz (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9), version: 0.3.2
2023-11-25T06:23:08,416 Skipping link: No binaries permitted for flash-attention-softmax-n: https://files.pythonhosted.org/packages/cd/a6/ff3a922ebc2ca1a51e7974112fb685411e18fed718f5eb428242f3048b26/flash_attention_softmax_n-0.3.2-py3-none-any.whl (from https://pypi.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,417 Fetching project page and analyzing links: https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-11-25T06:23:08,417 Getting page https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-11-25T06:23:08,419 Found index url https://www.piwheels.org/simple/
2023-11-25T06:23:08,608 Fetched page https://www.piwheels.org/simple/flash-attention-softmax-n/ as text/html
2023-11-25T06:23:08,612 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.3.1-py3-none-any.whl#sha256=99f48a8df88cbe2a9c307e4b69e6eea382fb91be219264e4e682ef604e2632d5 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,614 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.3.0-py3-none-any.whl#sha256=021016968ea910c2d56f23ff965a97fa4fea5dc97f37a5271a5d3dbd3cdf1a87 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,614 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.2.1-py3-none-any.whl#sha256=e41d35be2f3fea375a98acf80f6f4eb013e55c39b9ba231c4c0a12d046610755 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,615 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.2.0-py3-none-any.whl#sha256=5a2688132ed69a84e44af1726e418a63020dcbbc685982ac369747bcd9e8808c (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,615 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.4-py3-none-any.whl#sha256=16f99402e918bba0cc9a6666cd3c06693ea0779f5d9d890d469fa18c466e4fce (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,616 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.3-py3-none-any.whl#sha256=330330ff2c854d62599ba39620220bdd55b187d6680e9027ed0486cc1077cf04 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,616 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.2-py3-none-any.whl#sha256=54a16bb63bb01235b3b4b49d98b7e1c20b6108f3ba638e8928e999a1a631e676 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,617 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.1-py3-none-any.whl#sha256=d906038cc35131336cc7a324748ca86291c64e02cfebcb61345c4f951f35201e (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,617 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.0-py3-none-any.whl#sha256=730f00afb82c068e4a9456e255e7a2bc68f2f4acb76668c32a6a745b3a784896 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,618 Skipping link: No binaries permitted for flash-attention-softmax-n: https://www.piwheels.org/simple/flash-attention-softmax-n/flash_attention_softmax_n-0.1.0rc6-py3-none-any.whl#sha256=2cd65986cf280ff10cef44640cfd6f5a7ecead0b80f310cf10b3404bd04953b6 (from https://www.piwheels.org/simple/flash-attention-softmax-n/) (requires-python:>=3.9)
2023-11-25T06:23:08,618 Skipping link: not a file: https://www.piwheels.org/simple/flash-attention-softmax-n/
2023-11-25T06:23:08,619 Skipping link: not a file: https://pypi.org/simple/flash-attention-softmax-n/
2023-11-25T06:23:08,640 Given no hashes to check 1 links for project 'flash-attention-softmax-n': discarding no candidates
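Everything above is pip's resolution phase: both configured indexes are searched, every wheel link is rejected with "No binaries permitted" (how pip reports the --no-binary option), and only the pinned 0.3.2 sdist survives as a candidate. The invocation itself is not recorded in this log; a command along the following lines, with the index URLs and requirement taken from the log and the remaining flags assumed, would produce this kind of trace:

    # assumed reconstruction of the build command; only the index URLs
    # and the pinned requirement are attested by the log itself
    pip wheel 'flash-attention-softmax-n==0.3.2' \
        --no-binary :all: \
        --index-url https://pypi.org/simple \
        --extra-index-url https://www.piwheels.org/simple \
        --wheel-dir ./wheels \
        -vvv --log build.log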
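The two "Fetched page" entries also show content negotiation on the simple index: PyPI answers in the PEP 691 JSON form (application/vnd.pypi.simple.v1+json), while piwheels serves the older PEP 503 HTML form (text/html). The JSON variant can be requested by hand with the same Accept header pip sends; piping through python -m json.tool is just one convenient way to pretty-print it:

    curl -s -H 'Accept: application/vnd.pypi.simple.v1+json' \
        https://pypi.org/simple/flash-attention-softmax-n/ \
        | python -m json.tool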
2023-11-25T06:23:08,660 Collecting flash-attention-softmax-n==0.3.2
2023-11-25T06:23:08,663 Created temporary directory: /tmp/pip-unpack-8u5ylp6y
2023-11-25T06:23:08,800 Downloading flash-attention-softmax-n-0.3.2.tar.gz (32 kB)
2023-11-25T06:23:08,921 Added flash-attention-softmax-n==0.3.2 from https://files.pythonhosted.org/packages/c1/f2/4d5520611214fe18a8163f798532b5b7d43fca191b11a811f0441135cc83/flash-attention-softmax-n-0.3.2.tar.gz to build tracker '/tmp/pip-build-tracker-de_ayto_'
2023-11-25T06:23:08,924 Running setup.py (path:/tmp/pip-wheel-v8evf89s/flash-attention-softmax-n_a9278c23be2145ec9524985b17b1834c/setup.py) egg_info for package flash-attention-softmax-n
2023-11-25T06:23:08,925 Created temporary directory: /tmp/pip-pip-egg-info-iuihf9rb
2023-11-25T06:23:08,925 Preparing metadata (setup.py): started
2023-11-25T06:23:08,927 Running command python setup.py egg_info
2023-11-25T06:23:12,365 running egg_info
2023-11-25T06:23:12,366 creating /tmp/pip-pip-egg-info-iuihf9rb/flash_attention_softmax_n.egg-info
2023-11-25T06:23:12,393 writing /tmp/pip-pip-egg-info-iuihf9rb/flash_attention_softmax_n.egg-info/PKG-INFO
2023-11-25T06:23:12,398 writing dependency_links to /tmp/pip-pip-egg-info-iuihf9rb/flash_attention_softmax_n.egg-info/dependency_links.txt
2023-11-25T06:23:12,401 writing requirements to /tmp/pip-pip-egg-info-iuihf9rb/flash_attention_softmax_n.egg-info/requires.txt
2023-11-25T06:23:12,404 writing top-level names to /tmp/pip-pip-egg-info-iuihf9rb/flash_attention_softmax_n.egg-info/top_level.txt
2023-11-25T06:23:12,405 writing manifest file '/tmp/pip-pip-egg-info-iuihf9rb/flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-11-25T06:23:12,489 reading manifest file '/tmp/pip-pip-egg-info-iuihf9rb/flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-11-25T06:23:12,492 adding license file 'LICENSE'
2023-11-25T06:23:12,494 writing manifest file '/tmp/pip-pip-egg-info-iuihf9rb/flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-11-25T06:23:12,607 Preparing metadata (setup.py): finished with status 'done'
2023-11-25T06:23:12,612 Source in /tmp/pip-wheel-v8evf89s/flash-attention-softmax-n_a9278c23be2145ec9524985b17b1834c has version 0.3.2, which satisfies requirement flash-attention-softmax-n==0.3.2 from https://files.pythonhosted.org/packages/c1/f2/4d5520611214fe18a8163f798532b5b7d43fca191b11a811f0441135cc83/flash-attention-softmax-n-0.3.2.tar.gz
2023-11-25T06:23:12,613 Removed flash-attention-softmax-n==0.3.2 from https://files.pythonhosted.org/packages/c1/f2/4d5520611214fe18a8163f798532b5b7d43fca191b11a811f0441135cc83/flash-attention-softmax-n-0.3.2.tar.gz from build tracker '/tmp/pip-build-tracker-de_ayto_'
2023-11-25T06:23:12,620 Created temporary directory: /tmp/pip-unpack-8e0mhvsz
2023-11-25T06:23:12,621 Created temporary directory: /tmp/pip-unpack-yku0zvag
2023-11-25T06:23:12,637 Building wheels for collected packages: flash-attention-softmax-n
2023-11-25T06:23:12,646 Created temporary directory: /tmp/pip-wheel-ihmk7hx_
2023-11-25T06:23:12,646 Building wheel for flash-attention-softmax-n (setup.py): started
2023-11-25T06:23:12,648 Destination directory: /tmp/pip-wheel-ihmk7hx_
2023-11-25T06:23:12,648 Running command python setup.py bdist_wheel
2023-11-25T06:23:14,207 running bdist_wheel
2023-11-25T06:23:14,353 running build
2023-11-25T06:23:14,354 running build_py
2023-11-25T06:23:14,382 creating build
2023-11-25T06:23:14,383 creating build/lib
2023-11-25T06:23:14,384 creating build/lib/flash_attention_softmax_n
2023-11-25T06:23:14,385 copying flash_attention_softmax_n/__init__.py -> build/lib/flash_attention_softmax_n
2023-11-25T06:23:14,387 creating build/lib/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,388 copying flash_attention_softmax_n/analysis/statistics.py -> build/lib/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,390 copying flash_attention_softmax_n/analysis/__init__.py -> build/lib/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,392 copying flash_attention_softmax_n/analysis/io.py -> build/lib/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,394 copying flash_attention_softmax_n/analysis/hooks.py -> build/lib/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,397 creating build/lib/flash_attention_softmax_n/core
2023-11-25T06:23:14,398 copying flash_attention_softmax_n/core/flash_attn.py -> build/lib/flash_attention_softmax_n/core
2023-11-25T06:23:14,400 copying flash_attention_softmax_n/core/functional.py -> build/lib/flash_attention_softmax_n/core
2023-11-25T06:23:14,403 copying flash_attention_softmax_n/core/flash_attn_triton.py -> build/lib/flash_attention_softmax_n/core
2023-11-25T06:23:14,405 copying flash_attention_softmax_n/core/__init__.py -> build/lib/flash_attention_softmax_n/core
2023-11-25T06:23:14,410 creating build/lib/flash_attention_softmax_n/surgery
2023-11-25T06:23:14,411 copying flash_attention_softmax_n/surgery/attention_softmax_n.py -> build/lib/flash_attention_softmax_n/surgery
2023-11-25T06:23:14,413 copying flash_attention_softmax_n/surgery/__init__.py -> build/lib/flash_attention_softmax_n/surgery
2023-11-25T06:23:14,416 creating build/lib/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,417 copying flash_attention_softmax_n/surgery/surgery_functions/utils.py -> build/lib/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,419 copying flash_attention_softmax_n/surgery/surgery_functions/__init__.py -> build/lib/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,421 copying flash_attention_softmax_n/surgery/surgery_functions/_bert.py -> build/lib/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,423 copying flash_attention_softmax_n/surgery/surgery_functions/_xlnet.py -> build/lib/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,459 /usr/local/lib/python3.11/dist-packages/setuptools/_distutils/cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
2023-11-25T06:23:14,460 !!
2023-11-25T06:23:14,461 ********************************************************************************
2023-11-25T06:23:14,462 Please avoid running ``setup.py`` directly.
2023-11-25T06:23:14,462 Instead, use pypa/build, pypa/installer or other
2023-11-25T06:23:14,463 standards-based tools.
2023-11-25T06:23:14,464 See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
2023-11-25T06:23:14,465 ********************************************************************************
2023-11-25T06:23:14,466 !!
2023-11-25T06:23:14,467 self.initialize_options()
2023-11-25T06:23:14,498 installing to build/bdist.linux-armv7l/wheel
2023-11-25T06:23:14,499 running install
2023-11-25T06:23:14,535 running install_lib
2023-11-25T06:23:14,564 creating build/bdist.linux-armv7l
2023-11-25T06:23:14,565 creating build/bdist.linux-armv7l/wheel
2023-11-25T06:23:14,567 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n
2023-11-25T06:23:14,568 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,569 copying build/lib/flash_attention_softmax_n/analysis/statistics.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,572 copying build/lib/flash_attention_softmax_n/analysis/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,574 copying build/lib/flash_attention_softmax_n/analysis/io.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,576 copying build/lib/flash_attention_softmax_n/analysis/hooks.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/analysis
2023-11-25T06:23:14,578 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-11-25T06:23:14,579 copying build/lib/flash_attention_softmax_n/core/flash_attn.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-11-25T06:23:14,582 copying build/lib/flash_attention_softmax_n/core/functional.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-11-25T06:23:14,584 copying build/lib/flash_attention_softmax_n/core/flash_attn_triton.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-11-25T06:23:14,587 copying build/lib/flash_attention_softmax_n/core/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/core
2023-11-25T06:23:14,589 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery
2023-11-25T06:23:14,590 copying build/lib/flash_attention_softmax_n/surgery/attention_softmax_n.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery
2023-11-25T06:23:14,593 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,594 copying build/lib/flash_attention_softmax_n/surgery/surgery_functions/utils.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,596 copying build/lib/flash_attention_softmax_n/surgery/surgery_functions/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,598 copying build/lib/flash_attention_softmax_n/surgery/surgery_functions/_bert.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,600 copying build/lib/flash_attention_softmax_n/surgery/surgery_functions/_xlnet.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery/surgery_functions
2023-11-25T06:23:14,602 copying build/lib/flash_attention_softmax_n/surgery/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n/surgery
2023-11-25T06:23:14,604 copying build/lib/flash_attention_softmax_n/__init__.py -> build/bdist.linux-armv7l/wheel/flash_attention_softmax_n
2023-11-25T06:23:14,605 running install_egg_info
2023-11-25T06:23:14,642 running egg_info
2023-11-25T06:23:14,669 writing flash_attention_softmax_n.egg-info/PKG-INFO
2023-11-25T06:23:16,625 writing dependency_links to flash_attention_softmax_n.egg-info/dependency_links.txt
2023-11-25T06:23:16,630 writing requirements to flash_attention_softmax_n.egg-info/requires.txt
2023-11-25T06:23:16,633 writing top-level names to flash_attention_softmax_n.egg-info/top_level.txt
2023-11-25T06:23:16,699 reading manifest file 'flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-11-25T06:23:16,703 adding license file 'LICENSE'
2023-11-25T06:23:16,708 writing manifest file 'flash_attention_softmax_n.egg-info/SOURCES.txt'
2023-11-25T06:23:16,711 Copying flash_attention_softmax_n.egg-info to build/bdist.linux-armv7l/wheel/flash_attention_softmax_n-0.3.2-py3.11.egg-info
2023-11-25T06:23:16,727 running install_scripts
2023-11-25T06:23:16,745 creating build/bdist.linux-armv7l/wheel/flash_attention_softmax_n-0.3.2.dist-info/WHEEL
2023-11-25T06:23:16,748 creating '/tmp/pip-wheel-ihmk7hx_/flash_attention_softmax_n-0.3.2-py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it
2023-11-25T06:23:16,750 adding 'flash_attention_softmax_n/__init__.py'
2023-11-25T06:23:16,752 adding 'flash_attention_softmax_n/analysis/__init__.py'
2023-11-25T06:23:16,754 adding 'flash_attention_softmax_n/analysis/hooks.py'
2023-11-25T06:23:16,755 adding 'flash_attention_softmax_n/analysis/io.py'
2023-11-25T06:23:16,756 adding 'flash_attention_softmax_n/analysis/statistics.py'
2023-11-25T06:23:16,758 adding 'flash_attention_softmax_n/core/__init__.py'
2023-11-25T06:23:16,760 adding 'flash_attention_softmax_n/core/flash_attn.py'
2023-11-25T06:23:16,762 adding 'flash_attention_softmax_n/core/flash_attn_triton.py'
2023-11-25T06:23:16,764 adding 'flash_attention_softmax_n/core/functional.py'
2023-11-25T06:23:16,766 adding 'flash_attention_softmax_n/surgery/__init__.py'
2023-11-25T06:23:16,767 adding 'flash_attention_softmax_n/surgery/attention_softmax_n.py'
2023-11-25T06:23:16,769 adding 'flash_attention_softmax_n/surgery/surgery_functions/__init__.py'
2023-11-25T06:23:16,771 adding 'flash_attention_softmax_n/surgery/surgery_functions/_bert.py'
2023-11-25T06:23:16,772 adding 'flash_attention_softmax_n/surgery/surgery_functions/_xlnet.py'
2023-11-25T06:23:16,776 adding 'flash_attention_softmax_n/surgery/surgery_functions/utils.py'
2023-11-25T06:23:16,783 adding 'flash_attention_softmax_n-0.3.2.dist-info/LICENSE'
2023-11-25T06:23:16,785 adding 'flash_attention_softmax_n-0.3.2.dist-info/METADATA'
2023-11-25T06:23:16,786 adding 'flash_attention_softmax_n-0.3.2.dist-info/WHEEL'
2023-11-25T06:23:16,787 adding 'flash_attention_softmax_n-0.3.2.dist-info/top_level.txt'
2023-11-25T06:23:16,788 adding 'flash_attention_softmax_n-0.3.2.dist-info/RECORD'
2023-11-25T06:23:16,790 removing build/bdist.linux-armv7l/wheel
2023-11-25T06:23:16,913 Building wheel for flash-attention-softmax-n (setup.py): finished with status 'done'
2023-11-25T06:23:16,916 Created wheel for flash-attention-softmax-n: filename=flash_attention_softmax_n-0.3.2-py3-none-any.whl size=34627 sha256=f3f4f25380f32153d5c5a57d0ebbcc268f975337288298c77684278354eeedd2
2023-11-25T06:23:16,917 Stored in directory: /tmp/pip-ephem-wheel-cache-ktl8mf01/wheels/6a/21/86/37663a65d1c51f49c55f4a18b0e0af7f7fb70111d5ed5f4444
2023-11-25T06:23:16,930 Successfully built flash-attention-softmax-n
2023-11-25T06:23:16,934 Removed build tracker: '/tmp/pip-build-tracker-de_ayto_'
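The "Created wheel" entry near the end records the artifact's size and SHA-256 digest. A downloaded copy of the wheel can be checked against that digest with sha256sum; note the mandatory two spaces between the hash and the filename in the check-list format:

    echo 'f3f4f25380f32153d5c5a57d0ebbcc268f975337288298c77684278354eeedd2  flash_attention_softmax_n-0.3.2-py3-none-any.whl' \
        | sha256sum --check -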
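Because a wheel is an ordinary zip archive, the file list from the "adding '...'" entries can be reproduced from the finished artifact; either of the following prints the same module files plus the dist-info metadata (LICENSE, METADATA, WHEEL, top_level.txt, RECORD):

    unzip -l flash_attention_softmax_n-0.3.2-py3-none-any.whl
    python -m zipfile -l flash_attention_softmax_n-0.3.2-py3-none-any.whl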
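Finally, the SetuptoolsDeprecationWarning emitted mid-build ("Please avoid running ``setup.py`` directly") points to pypa/build as the standards-based replacement for the setup.py bdist_wheel command pip ran here. A sketch of the equivalent modern build, assuming the sdist from this log has already been downloaded into the working directory:

    # PEP 517 build of the same sdist with pypa/build instead of setup.py
    pip install build
    tar xzf flash-attention-softmax-n-0.3.2.tar.gz
    cd flash-attention-softmax-n-0.3.2
    python -m build --wheel    # resulting wheel is written to dist/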