2025-09-18T00:21:33,118 Created temporary directory: /tmp/pip-ephem-wheel-cache-gykymrb4
2025-09-18T00:21:33,121 Created temporary directory: /tmp/pip-req-tracker-efrjq3nf
2025-09-18T00:21:33,121 Initialized build tracking at /tmp/pip-req-tracker-efrjq3nf
2025-09-18T00:21:33,121 Created build tracker: /tmp/pip-req-tracker-efrjq3nf
2025-09-18T00:21:33,121 Entered build tracker: /tmp/pip-req-tracker-efrjq3nf
2025-09-18T00:21:33,122 Created temporary directory: /tmp/pip-wheel-d_6qw_5l
2025-09-18T00:21:33,143 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple, https://www.piwheels.org/simple
2025-09-18T00:21:33,145 2 location(s) to search for versions of flash-linear-attention:
2025-09-18T00:21:33,145 * https://pypi.org/simple/flash-linear-attention/
2025-09-18T00:21:33,145 * https://www.piwheels.org/simple/flash-linear-attention/
2025-09-18T00:21:33,146 Fetching project page and analyzing links: https://pypi.org/simple/flash-linear-attention/
2025-09-18T00:21:33,146 Getting page https://pypi.org/simple/flash-linear-attention/
2025-09-18T00:21:33,147 Found index url https://pypi.org/simple
2025-09-18T00:21:33,152 Starting new HTTPS connection (1): pypi.org:443
2025-09-18T00:21:33,194 https://pypi.org:443 "GET /simple/flash-linear-attention/ HTTP/1.1" 200 2430
2025-09-18T00:21:33,273 Link requires a different Python (3.9.2 not in: '>=3.10'): https://files.pythonhosted.org/packages/b0/ee/2b4cd1048125d51d3ff5a24f319e45b597f8d3db3c6abdc33516f5e85dd6/flash-linear-attention-0.1.tar.gz#sha256=aba21d6c2004081dbf19336d804c10ba13e8807f2b3df3286dbb5b63e334a164 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,274 Skipping link: No binaries permitted for flash-linear-attention: https://files.pythonhosted.org/packages/3b/4b/cd1053b9d113abb855f6bcf9a9d9ceee5c6bab69cd646187aa583f66661f/flash_linear_attention-0.1-py3-none-any.whl#sha256=089ff221613a9fa396e01a330a012f63b40c0b8fb18c8264f8e2b36939eaa34d (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,274 Link requires a different Python (3.9.2 not in: '>=3.10'): https://files.pythonhosted.org/packages/a7/45/5f7220226495877ae28988f027e1b2f91931d84e16ffb9462e80a5a4f241/flash-linear-attention-0.1.1.tar.gz#sha256=e5ef08529fbb74cb5143747452f6a460c2759f72b7e349ae0f2bb045368ba308 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,275 Skipping link: No binaries permitted for flash-linear-attention: https://files.pythonhosted.org/packages/81/f3/bdbd82e1d1c997a4300b88f1b780bcdb63de1fb8e604bd21eea3a2a134f2/flash_linear_attention-0.1.1-py3-none-any.whl#sha256=75346f241005be44b3f3f6df792aa81d865d3f4ab0c0f19c74ece2c39d63482d (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,275 Skipping link: No binaries permitted for flash-linear-attention: https://files.pythonhosted.org/packages/9f/b0/c4e740e37b7cb5863b007722d30307956c4d22f8cd01156e567502ef96bf/flash_linear_attention-0.1.2-py3-none-any.whl#sha256=c823924cb1e8f53f27fe3588a5f34792ddd6545a0212ae462da2e4d79cac1f59 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,276 Link requires a different Python (3.9.2 not in: '>=3.10'): https://files.pythonhosted.org/packages/5e/c7/59fc20e045424dc77129b2b94004d752142054232180d1220a2a68d6ade4/flash_linear_attention-0.1.2.tar.gz#sha256=b09bde56c0d994f8b4bf490e33e5b4a8c1b7e9af2359b8670a5da5a26e3821e3 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,276 Skipping link: No binaries permitted for flash-linear-attention: https://files.pythonhosted.org/packages/8e/8d/154ea1c30f6a7f040d8a92f5f4cb03ba79d2474f587176bc6a342e15a4fc/flash_linear_attention-0.2.0-py3-none-any.whl#sha256=f526fbe287f60071e4b85fac18593dd143b0d96d6303b533326e9fd61afb80c5 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,276 Link requires a different Python (3.9.2 not in: '>=3.10'): https://files.pythonhosted.org/packages/5e/b2/9f29181db054f7e2ca127de4010a99776425e3b8da0bcd41bff59256f905/flash_linear_attention-0.2.0.tar.gz#sha256=a0888508daa764cb267625cb751a00db119f3688661836cfd07b36ebcbceafa7 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,277 Skipping link: No binaries permitted for flash-linear-attention: https://files.pythonhosted.org/packages/c2/36/d40758b984036db921ff7c542791272a5f59628585b1518e36d148b06b0a/flash_linear_attention-0.2.1-py3-none-any.whl#sha256=385ba0719b002b43a0c3a8dc2d711c469b1675b78d6424346dd083da20b03e9d (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,277 Link requires a different Python (3.9.2 not in: '>=3.10'): https://files.pythonhosted.org/packages/e2/b8/1b8aa3397ad6bc410df608ccd11c437066a0b2355e51e96bab4b64bdf4ad/flash_linear_attention-0.2.1.tar.gz#sha256=d687425211f04f90e7ee94bbe5848a34a9b7322d588019d7d3ca5d24cbd95124 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,278 Skipping link: No binaries permitted for flash-linear-attention: https://files.pythonhosted.org/packages/be/7e/6b810afd2098726b5b753c56a9a7e1f6cb1dc9dbed048f49948b80c7e436/flash_linear_attention-0.2.2-py3-none-any.whl#sha256=b6a62b140209d1350861d768e9acb3786548a47960cc324aca39de940ca6b83f (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,278 Link requires a different Python (3.9.2 not in: '>=3.10'): https://files.pythonhosted.org/packages/ec/67/9b9d7ddfe879f71d0ba169db04cb631eedda8f9a437c6223439c12dfe148/flash_linear_attention-0.2.2.tar.gz#sha256=fd620a5b7ce1b0280ebefaa437153ea81b1d95794e8e21662a5ec83369d23cea (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,278 Skipping link: No binaries permitted for flash-linear-attention: https://files.pythonhosted.org/packages/ec/16/df1c4d74d647c51c3758f26b50681ec7e92343f0f4e3808b7b60768b795f/flash_linear_attention-0.3.0-py3-none-any.whl#sha256=5ba20ee898fcb7b0505de2cbd8772b1024ef34f9e97e73ca1d669276ca493b42 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,279 Link requires a different Python (3.9.2 not in: '>=3.10'): https://files.pythonhosted.org/packages/6d/03/68de1cdae3c037a0cae44825ee47edefe01d7281df04fb0211432afbd2b7/flash_linear_attention-0.3.0.tar.gz#sha256=e001df933db346353f4e1fc959047318f69e890d8cd34872a01087c7c2e0f5fb (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,279 Skipping link: No binaries permitted for flash-linear-attention: https://files.pythonhosted.org/packages/7d/53/6dbd370bbf39fea35133aa7ebba0a9a7957530f86f3d55baa595c93e734b/flash_linear_attention-0.3.1-py3-none-any.whl#sha256=de6bbd2ffc56ec765a70474d75ee7fac610e9f66ebdccc552fe5c01e016cf5c5 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,280 Link requires a different Python (3.9.2 not in: '>=3.10'): https://files.pythonhosted.org/packages/eb/c8/ac810dd9d4f8ba3db1c78043a08b8a505bdf06814d3c343c429722593da6/flash_linear_attention-0.3.1.tar.gz#sha256=15ad483a8dbed37072dde881408b6f83da3487dc24f8c165a862821c10aebe4b (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,280 Skipping link: No binaries permitted for flash-linear-attention: https://files.pythonhosted.org/packages/a0/d0/35ce9eac5f52c72005095aaa12a393d2656ed7ffedf925b2381a6b76d10c/flash_linear_attention-0.3.2-py3-none-any.whl#sha256=604e73361437ba786420ab195e2caa3fd19280503761e703fa353c5ce5c65376 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,281 Link requires a different Python (3.9.2 not in: '>=3.10'): https://files.pythonhosted.org/packages/84/f6/e62c1e562a288557eba7f06f168a7615813d1a227327b8beb8ba426da2c5/flash_linear_attention-0.3.2.tar.gz#sha256=9147747316c2951fed4ebeb4fa87977c05d807dc70c93b46250b68a6eb1183e2 (from https://pypi.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,281 Fetching project page and analyzing links: https://www.piwheels.org/simple/flash-linear-attention/
2025-09-18T00:21:33,281 Getting page https://www.piwheels.org/simple/flash-linear-attention/
2025-09-18T00:21:33,282 Found index url https://www.piwheels.org/simple
2025-09-18T00:21:33,284 Starting new HTTPS connection (1): www.piwheels.org:443
2025-09-18T00:21:33,347 https://www.piwheels.org:443 "GET /simple/flash-linear-attention/ HTTP/1.1" 200 412
2025-09-18T00:21:33,352 Skipping link: No binaries permitted for flash-linear-attention: https://www.piwheels.org/simple/flash-linear-attention/flash_linear_attention-0.3.2-py3-none-any.whl#sha256=e1abdcacf6314b4886190a36699a633bdb5ad9906e3c00fbe74fa29aba18f41d (from https://www.piwheels.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,353 Skipping link: No binaries permitted for flash-linear-attention: https://www.piwheels.org/simple/flash-linear-attention/flash_linear_attention-0.3.1-py3-none-any.whl#sha256=fad0b89f44a79c552fb030d45a41cb2f978dc3b22d25e1b70b50c211dd2ff006 (from https://www.piwheels.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,353 Skipping link: No binaries permitted for flash-linear-attention: https://archive1.piwheels.org/simple/flash-linear-attention/flash_linear_attention-0.3.0-py3-none-any.whl#sha256=083b2e8da45f9ed89b48cea8d6935bcc3b18f08aa015d91c6133eef3e8b47a7d (from https://www.piwheels.org/simple/flash-linear-attention/) (requires-python:>=3.10)
2025-09-18T00:21:33,380 Given no hashes to check 0 links for project 'flash-linear-attention': discarding no candidates
2025-09-18T00:21:33,380 ERROR: Could not find a version that satisfies the requirement flash-linear-attention==0.2.1
2025-09-18T00:21:33,381 ERROR: No matching distribution found for flash-linear-attention==0.2.1
2025-09-18T00:21:33,381 Exception information:
2025-09-18T00:21:33,381 Traceback (most recent call last):
2025-09-18T00:21:33,381   File "/usr/share/python-wheels/resolvelib-0.5.4-py2.py3-none-any.whl/resolvelib/resolvers.py", line 171, in _merge_into_criterion
2025-09-18T00:21:33,381     crit = self.state.criteria[name]
2025-09-18T00:21:33,381 KeyError: 'flash-linear-attention'
2025-09-18T00:21:33,381
2025-09-18T00:21:33,381 During handling of the above exception, another exception occurred:
2025-09-18T00:21:33,381
2025-09-18T00:21:33,381 Traceback (most recent call last):
2025-09-18T00:21:33,381   File "/usr/share/python-wheels/resolvelib-0.5.4-py2.py3-none-any.whl/resolvelib/resolvers.py", line 318, in resolve
2025-09-18T00:21:33,381     name, crit = self._merge_into_criterion(r, parent=None)
2025-09-18T00:21:33,381   File "/usr/share/python-wheels/resolvelib-0.5.4-py2.py3-none-any.whl/resolvelib/resolvers.py", line 173, in _merge_into_criterion
2025-09-18T00:21:33,381     crit = Criterion.from_requirement(self._p, requirement, parent)
2025-09-18T00:21:33,381   File "/usr/share/python-wheels/resolvelib-0.5.4-py2.py3-none-any.whl/resolvelib/resolvers.py", line 83, in from_requirement
2025-09-18T00:21:33,381     raise RequirementsConflicted(criterion)
2025-09-18T00:21:33,381 resolvelib.resolvers.RequirementsConflicted: Requirements conflict: SpecifierRequirement('flash-linear-attention==0.2.1')
2025-09-18T00:21:33,381
2025-09-18T00:21:33,381 During handling of the above exception, another exception occurred:
2025-09-18T00:21:33,381
2025-09-18T00:21:33,381 Traceback (most recent call last):
2025-09-18T00:21:33,381   File "/usr/lib/python3/dist-packages/pip/_internal/resolution/resolvelib/resolver.py", line 122, in resolve
2025-09-18T00:21:33,381     self._result = resolver.resolve(
2025-09-18T00:21:33,381   File "/usr/share/python-wheels/resolvelib-0.5.4-py2.py3-none-any.whl/resolvelib/resolvers.py", line 453, in resolve
2025-09-18T00:21:33,381     state = resolution.resolve(requirements, max_rounds=max_rounds)
2025-09-18T00:21:33,381   File "/usr/share/python-wheels/resolvelib-0.5.4-py2.py3-none-any.whl/resolvelib/resolvers.py", line 320, in resolve
2025-09-18T00:21:33,381     raise ResolutionImpossible(e.criterion.information)
2025-09-18T00:21:33,381 resolvelib.resolvers.ResolutionImpossible: [RequirementInformation(requirement=SpecifierRequirement('flash-linear-attention==0.2.1'), parent=None)]
2025-09-18T00:21:33,381
2025-09-18T00:21:33,381 The above exception was the direct cause of the following exception:
2025-09-18T00:21:33,381
2025-09-18T00:21:33,381 Traceback (most recent call last):
2025-09-18T00:21:33,381   File "/usr/lib/python3/dist-packages/pip/_internal/cli/base_command.py", line 223, in _main
2025-09-18T00:21:33,381     status = self.run(options, args)
2025-09-18T00:21:33,381   File "/usr/lib/python3/dist-packages/pip/_internal/cli/req_command.py", line 180, in wrapper
2025-09-18T00:21:33,381     return func(self, options, args)
2025-09-18T00:21:33,381   File "/usr/lib/python3/dist-packages/pip/_internal/commands/wheel.py", line 162, in run
2025-09-18T00:21:33,381     requirement_set = resolver.resolve(
2025-09-18T00:21:33,381   File "/usr/lib/python3/dist-packages/pip/_internal/resolution/resolvelib/resolver.py", line 128, in resolve
2025-09-18T00:21:33,381     six.raise_from(error, e)
2025-09-18T00:21:33,381   File "", line 3, in raise_from
2025-09-18T00:21:33,381 pip._internal.exceptions.DistributionNotFound: No matching distribution found for flash-linear-attention==0.2.1
2025-09-18T00:21:33,385 Removed build tracker: '/tmp/pip-req-tracker-efrjq3nf'