2026-01-01T06:10:27,905 Created temporary directory: /tmp/pip-ephem-wheel-cache-1s6b03yy
2026-01-01T06:10:27,906 Created temporary directory: /tmp/pip-build-tracker-f9r_1ph2
2026-01-01T06:10:27,907 Initialized build tracking at /tmp/pip-build-tracker-f9r_1ph2
2026-01-01T06:10:27,908 Created build tracker: /tmp/pip-build-tracker-f9r_1ph2
2026-01-01T06:10:27,908 Entered build tracker: /tmp/pip-build-tracker-f9r_1ph2
2026-01-01T06:10:27,909 Created temporary directory: /tmp/pip-wheel-jmbceufo
2026-01-01T06:10:27,912 DEPRECATION: --no-binary currently disables reading from the cache of locally built wheels. In the future --no-binary will not influence the wheel cache. pip 23.1 will enforce this behaviour change. A possible replacement is to use the --no-cache-dir option. You can use the flag --use-feature=no-binary-enable-wheel-cache to test the upcoming behaviour. Discussion can be found at https://github.com/pypa/pip/issues/11453
2026-01-01T06:10:27,914 Created temporary directory: /tmp/pip-ephem-wheel-cache-pq6ph0mu
2026-01-01T06:10:27,937 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
2026-01-01T06:10:27,941 2 location(s) to search for versions of llm-inference-engine:
2026-01-01T06:10:27,941 * https://pypi.org/simple/llm-inference-engine/
2026-01-01T06:10:27,941 * https://www.piwheels.org/simple/llm-inference-engine/
2026-01-01T06:10:27,942 Fetching project page and analyzing links: https://pypi.org/simple/llm-inference-engine/
2026-01-01T06:10:27,943 Getting page https://pypi.org/simple/llm-inference-engine/
2026-01-01T06:10:27,944 Found index url https://pypi.org/simple
2026-01-01T06:10:28,154 Fetched page https://pypi.org/simple/llm-inference-engine/ as application/vnd.pypi.simple.v1+json
2026-01-01T06:10:28,156 Skipping link: No binaries permitted for llm-inference-engine: https://files.pythonhosted.org/packages/fa/f7/c4c711d8734811e438edb9bb9499caeefbf5efeb1e78bc9504cca7a5aac3/llm_inference_engine-0.1.0-py3-none-any.whl (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11)
2026-01-01T06:10:28,157 Found link https://files.pythonhosted.org/packages/a2/e9/3eebe7915f5f43fdc0fcde496c8151701785895c64791bf379a9c2de9f4b/llm_inference_engine-0.1.0.tar.gz (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11), version: 0.1.0
2026-01-01T06:10:28,158 Skipping link: No binaries permitted for llm-inference-engine: https://files.pythonhosted.org/packages/3f/3a/a6a7b6748e1e0a323d321617effc6076cb3c1ecf90c009d5cac666a7a709/llm_inference_engine-0.1.1-py3-none-any.whl (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11)
2026-01-01T06:10:28,159 Found link https://files.pythonhosted.org/packages/0e/40/59fa25293e1b212efed1641d3884f4ef8de72dbf66bae24146d34288fe80/llm_inference_engine-0.1.1.tar.gz (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11), version: 0.1.1
2026-01-01T06:10:28,160 Skipping link: No binaries permitted for llm-inference-engine: https://files.pythonhosted.org/packages/04/11/e9d15356c6194527d1abf49b0f41f52cf5ca9e4ced4576394448bde9e987/llm_inference_engine-0.1.2-py3-none-any.whl (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11)
2026-01-01T06:10:28,161 Found link https://files.pythonhosted.org/packages/69/fc/a0134f2249ab493695c819c519121007c3f44cd5e2c50bf00ad0767dfa72/llm_inference_engine-0.1.2.tar.gz (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11), version: 0.1.2
2026-01-01T06:10:28,162 Fetching project page and analyzing links: https://www.piwheels.org/simple/llm-inference-engine/
2026-01-01T06:10:28,163 Getting page https://www.piwheels.org/simple/llm-inference-engine/
2026-01-01T06:10:28,164 Found index url https://www.piwheels.org/simple
2026-01-01T06:10:28,470 Fetched page https://www.piwheels.org/simple/llm-inference-engine/ as text/html
2026-01-01T06:10:28,472 Skipping link: No binaries permitted for llm-inference-engine: https://www.piwheels.org/simple/llm-inference-engine/llm_inference_engine-0.1.1-py3-none-any.whl#sha256=13de0552e2cd23270bcab4b8693e608bf6b51d5bece9f7317609c5441d7a185c (from https://www.piwheels.org/simple/llm-inference-engine/) (requires-python:>=3.11,<4.0)
2026-01-01T06:10:28,473 Skipping link: No binaries permitted for llm-inference-engine: https://www.piwheels.org/simple/llm-inference-engine/llm_inference_engine-0.1.0-py3-none-any.whl#sha256=567bada763146e979aca8976400d6255a42006fd5456d1c9364b975ed2a5f720 (from https://www.piwheels.org/simple/llm-inference-engine/) (requires-python:>=3.11,<4.0)
2026-01-01T06:10:28,474 Skipping link: not a file: https://www.piwheels.org/simple/llm-inference-engine/
2026-01-01T06:10:28,474 Skipping link: not a file: https://pypi.org/simple/llm-inference-engine/
2026-01-01T06:10:28,493 Given no hashes to check 1 links for project 'llm-inference-engine': discarding no candidates
2026-01-01T06:10:28,511 Collecting llm-inference-engine==0.1.2
2026-01-01T06:10:28,514 Created temporary directory: /tmp/pip-unpack-2rnemplm
2026-01-01T06:10:28,646 Downloading llm_inference_engine-0.1.2.tar.gz (9.5 kB)
2026-01-01T06:10:28,670 Added llm-inference-engine==0.1.2 from https://files.pythonhosted.org/packages/69/fc/a0134f2249ab493695c819c519121007c3f44cd5e2c50bf00ad0767dfa72/llm_inference_engine-0.1.2.tar.gz to build tracker '/tmp/pip-build-tracker-f9r_1ph2'
2026-01-01T06:10:28,674 Created temporary directory: /tmp/pip-build-env-1_7_didu
2026-01-01T06:10:28,678 Installing build dependencies: started
2026-01-01T06:10:28,679 Running command pip subprocess to install build dependencies
2026-01-01T06:10:29,884 Using pip 23.0.1 from /usr/lib/python3/dist-packages/pip (python 3.11)
2026-01-01T06:10:30,551 DEPRECATION: --no-binary currently disables reading from the cache of locally built wheels. In the future --no-binary will not influence the wheel cache. pip 23.1 will enforce this behaviour change. A possible replacement is to use the --no-cache-dir option. You can use the flag --use-feature=no-binary-enable-wheel-cache to test the upcoming behaviour. Discussion can be found at https://github.com/pypa/pip/issues/11453
2026-01-01T06:10:30,576 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
2026-01-01T06:10:31,102 Collecting poetry-core
2026-01-01T06:10:31,140 Using cached https://www.piwheels.org/simple/poetry-core/poetry_core-2.2.1-py3-none-any.whl (338 kB)
2026-01-01T06:10:33,702 Installing collected packages: poetry-core
2026-01-01T06:10:34,917 Successfully installed poetry-core-2.2.1
2026-01-01T06:10:35,149 Installing build dependencies: finished with status 'done'
2026-01-01T06:10:35,155 Getting requirements to build wheel: started
2026-01-01T06:10:35,156 Running command Getting requirements to build wheel
2026-01-01T06:10:35,609 Getting requirements to build wheel: finished with status 'done'
2026-01-01T06:10:35,612 Created temporary directory: /tmp/pip-modern-metadata-5nnf5er_
2026-01-01T06:10:35,614 Preparing metadata (pyproject.toml): started
2026-01-01T06:10:35,615 Running command Preparing metadata (pyproject.toml)
2026-01-01T06:10:36,293 Preparing metadata (pyproject.toml): finished with status 'done'
2026-01-01T06:10:36,297 Source in /tmp/pip-wheel-jmbceufo/llm-inference-engine_69c2d92d5f914d9c9f3e7a9f3d3f298c has version 0.1.2, which satisfies requirement llm-inference-engine==0.1.2 from https://files.pythonhosted.org/packages/69/fc/a0134f2249ab493695c819c519121007c3f44cd5e2c50bf00ad0767dfa72/llm_inference_engine-0.1.2.tar.gz
2026-01-01T06:10:36,298 Removed llm-inference-engine==0.1.2 from https://files.pythonhosted.org/packages/69/fc/a0134f2249ab493695c819c519121007c3f44cd5e2c50bf00ad0767dfa72/llm_inference_engine-0.1.2.tar.gz from build tracker '/tmp/pip-build-tracker-f9r_1ph2'
2026-01-01T06:10:36,304 Created temporary directory: /tmp/pip-unpack-uhaovci3
2026-01-01T06:10:36,304 Building wheels for collected packages: llm-inference-engine
2026-01-01T06:10:36,309 Created temporary directory: /tmp/pip-wheel-15c0zqt9
2026-01-01T06:10:36,309 Destination directory: /tmp/pip-wheel-15c0zqt9
2026-01-01T06:10:36,312 Building wheel for llm-inference-engine (pyproject.toml): started
2026-01-01T06:10:36,314 Running command Building wheel for llm-inference-engine (pyproject.toml)
2026-01-01T06:10:37,017 Building wheel for llm-inference-engine (pyproject.toml): finished with status 'done'
2026-01-01T06:10:37,021 Created wheel for llm-inference-engine: filename=llm_inference_engine-0.1.2-py3-none-any.whl size=11010 sha256=78b1b95da7785ddfb70c025bac9063d099a83eb028a32d6c855ce87850f792c2
2026-01-01T06:10:37,023 Stored in directory: /tmp/pip-ephem-wheel-cache-pq6ph0mu/wheels/71/8c/9f/a37d3eabb25073e7e853e315ce1c7d0b142dba0497f0945917
2026-01-01T06:10:37,031 Successfully built llm-inference-engine
2026-01-01T06:10:37,034 Removed build tracker: '/tmp/pip-build-tracker-f9r_1ph2'
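Every entry in the log above carries a timestamp in the form `YYYY-MM-DDTHH:MM:SS,mmm` (note the comma before the milliseconds, the logging-module default). A minimal sketch, assuming only that stamp format, for measuring how long any step of the build took; the helper names `parse_line` and `elapsed_seconds` are illustrative, not part of pip:

```python
from datetime import datetime

# pip's verbose log stamps use a comma before the milliseconds,
# which strptime's %f handles once the comma is in the format string.
TS_FORMAT = "%Y-%m-%dT%H:%M:%S,%f"

def parse_line(line):
    """Split one log line into (datetime, message)."""
    stamp, _, message = line.partition(" ")
    return datetime.strptime(stamp, TS_FORMAT), message

def elapsed_seconds(first_line, last_line):
    """Wall-clock seconds between two log lines."""
    start, _ = parse_line(first_line)
    end, _ = parse_line(last_line)
    return (end - start).total_seconds()

# Example: the wheel-build step in the log above,
# from "started" (06:10:36,312) to "finished" (06:10:37,017).
print(elapsed_seconds(
    "2026-01-01T06:10:36,312 Building wheel for llm-inference-engine (pyproject.toml): started",
    "2026-01-01T06:10:37,017 Building wheel for llm-inference-engine (pyproject.toml): finished with status 'done'",
))  # → 0.705
```

Applied to the first and last entries, the whole run (directory setup through tracker removal) took about 9.1 seconds, most of it spent installing the poetry-core build dependency.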