2025-12-23T20:48:02,439 Created temporary directory: /tmp/pip-ephem-wheel-cache-ghizpd_g
2025-12-23T20:48:02,441 Created temporary directory: /tmp/pip-build-tracker-70spwcw8
2025-12-23T20:48:02,441 Initialized build tracking at /tmp/pip-build-tracker-70spwcw8
2025-12-23T20:48:02,442 Created build tracker: /tmp/pip-build-tracker-70spwcw8
2025-12-23T20:48:02,442 Entered build tracker: /tmp/pip-build-tracker-70spwcw8
2025-12-23T20:48:02,443 Created temporary directory: /tmp/pip-wheel-9mpog2dn
2025-12-23T20:48:02,446 DEPRECATION: --no-binary currently disables reading from the cache of locally built wheels. In the future --no-binary will not influence the wheel cache. pip 23.1 will enforce this behaviour change. A possible replacement is to use the --no-cache-dir option. You can use the flag --use-feature=no-binary-enable-wheel-cache to test the upcoming behaviour. Discussion can be found at https://github.com/pypa/pip/issues/11453
2025-12-23T20:48:02,449 Created temporary directory: /tmp/pip-ephem-wheel-cache-lfw2f70i
2025-12-23T20:48:02,474 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
2025-12-23T20:48:02,478 2 location(s) to search for versions of llm-inference-engine:
2025-12-23T20:48:02,478 * https://pypi.org/simple/llm-inference-engine/
2025-12-23T20:48:02,478 * https://www.piwheels.org/simple/llm-inference-engine/
2025-12-23T20:48:02,479 Fetching project page and analyzing links: https://pypi.org/simple/llm-inference-engine/
2025-12-23T20:48:02,480 Getting page https://pypi.org/simple/llm-inference-engine/
2025-12-23T20:48:02,481 Found index url https://pypi.org/simple
2025-12-23T20:48:02,699 Fetched page https://pypi.org/simple/llm-inference-engine/ as application/vnd.pypi.simple.v1+json
2025-12-23T20:48:02,701 Skipping link: No binaries permitted for llm-inference-engine: https://files.pythonhosted.org/packages/fa/f7/c4c711d8734811e438edb9bb9499caeefbf5efeb1e78bc9504cca7a5aac3/llm_inference_engine-0.1.0-py3-none-any.whl (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11)
2025-12-23T20:48:02,702 Found link https://files.pythonhosted.org/packages/a2/e9/3eebe7915f5f43fdc0fcde496c8151701785895c64791bf379a9c2de9f4b/llm_inference_engine-0.1.0.tar.gz (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11), version: 0.1.0
2025-12-23T20:48:02,702 Skipping link: No binaries permitted for llm-inference-engine: https://files.pythonhosted.org/packages/3f/3a/a6a7b6748e1e0a323d321617effc6076cb3c1ecf90c009d5cac666a7a709/llm_inference_engine-0.1.1-py3-none-any.whl (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11)
2025-12-23T20:48:02,704 Found link https://files.pythonhosted.org/packages/0e/40/59fa25293e1b212efed1641d3884f4ef8de72dbf66bae24146d34288fe80/llm_inference_engine-0.1.1.tar.gz (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11), version: 0.1.1
2025-12-23T20:48:02,704 Fetching project page and analyzing links: https://www.piwheels.org/simple/llm-inference-engine/
2025-12-23T20:48:02,705 Getting page https://www.piwheels.org/simple/llm-inference-engine/
2025-12-23T20:48:02,706 Found index url https://www.piwheels.org/simple
2025-12-23T20:48:02,865 Fetched page https://www.piwheels.org/simple/llm-inference-engine/ as text/html
2025-12-23T20:48:02,867 Skipping link: No binaries permitted for llm-inference-engine: https://www.piwheels.org/simple/llm-inference-engine/llm_inference_engine-0.1.0-py3-none-any.whl#sha256=567bada763146e979aca8976400d6255a42006fd5456d1c9364b975ed2a5f720 (from https://www.piwheels.org/simple/llm-inference-engine/) (requires-python:>=3.11,<4.0)
2025-12-23T20:48:02,867 Skipping link: not a file: https://www.piwheels.org/simple/llm-inference-engine/
2025-12-23T20:48:02,868 Skipping link: not a file: https://pypi.org/simple/llm-inference-engine/
2025-12-23T20:48:02,887 Given no hashes to check 1 links for project 'llm-inference-engine': discarding no candidates
2025-12-23T20:48:02,905 Collecting llm-inference-engine==0.1.1
2025-12-23T20:48:02,907 Created temporary directory: /tmp/pip-unpack-urizz_w5
2025-12-23T20:48:03,119 Downloading llm_inference_engine-0.1.1.tar.gz (9.0 kB)
2025-12-23T20:48:03,143 Added llm-inference-engine==0.1.1 from https://files.pythonhosted.org/packages/0e/40/59fa25293e1b212efed1641d3884f4ef8de72dbf66bae24146d34288fe80/llm_inference_engine-0.1.1.tar.gz to build tracker '/tmp/pip-build-tracker-70spwcw8'
2025-12-23T20:48:03,147 Created temporary directory: /tmp/pip-build-env-xhj8cjo9
2025-12-23T20:48:03,151 Installing build dependencies: started
2025-12-23T20:48:03,153 Running command pip subprocess to install build dependencies
2025-12-23T20:48:04,293 Using pip 23.0.1 from /usr/lib/python3/dist-packages/pip (python 3.11)
2025-12-23T20:48:04,920 DEPRECATION: --no-binary currently disables reading from the cache of locally built wheels. In the future --no-binary will not influence the wheel cache. pip 23.1 will enforce this behaviour change. A possible replacement is to use the --no-cache-dir option. You can use the flag --use-feature=no-binary-enable-wheel-cache to test the upcoming behaviour. Discussion can be found at https://github.com/pypa/pip/issues/11453
2025-12-23T20:48:04,942 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
2025-12-23T20:48:05,467 Collecting poetry-core
2025-12-23T20:48:05,501 Using cached https://www.piwheels.org/simple/poetry-core/poetry_core-2.2.1-py3-none-any.whl (338 kB)
2025-12-23T20:48:07,998 Installing collected packages: poetry-core
2025-12-23T20:48:09,277 Successfully installed poetry-core-2.2.1
2025-12-23T20:48:09,514 Installing build dependencies: finished with status 'done'
2025-12-23T20:48:09,521 Getting requirements to build wheel: started
2025-12-23T20:48:09,523 Running command Getting requirements to build wheel
2025-12-23T20:48:10,010 Getting requirements to build wheel: finished with status 'done'
2025-12-23T20:48:10,014 Created temporary directory: /tmp/pip-modern-metadata-q4imk1fo
2025-12-23T20:48:10,016 Preparing metadata (pyproject.toml): started
2025-12-23T20:48:10,017 Running command Preparing metadata (pyproject.toml)
2025-12-23T20:48:10,709 Preparing metadata (pyproject.toml): finished with status 'done'
2025-12-23T20:48:10,714 Source in /tmp/pip-wheel-9mpog2dn/llm-inference-engine_73647b13df5041f9b6b5418a3a866c4d has version 0.1.1, which satisfies requirement llm-inference-engine==0.1.1 from https://files.pythonhosted.org/packages/0e/40/59fa25293e1b212efed1641d3884f4ef8de72dbf66bae24146d34288fe80/llm_inference_engine-0.1.1.tar.gz
2025-12-23T20:48:10,715 Removed llm-inference-engine==0.1.1 from https://files.pythonhosted.org/packages/0e/40/59fa25293e1b212efed1641d3884f4ef8de72dbf66bae24146d34288fe80/llm_inference_engine-0.1.1.tar.gz from build tracker '/tmp/pip-build-tracker-70spwcw8'
2025-12-23T20:48:10,720 Created temporary directory: /tmp/pip-unpack-ology92o
2025-12-23T20:48:10,720 Building wheels for collected packages: llm-inference-engine
2025-12-23T20:48:10,725 Created temporary directory: /tmp/pip-wheel-_uoh98vo
2025-12-23T20:48:10,726 Destination directory: /tmp/pip-wheel-_uoh98vo
2025-12-23T20:48:10,728 Building wheel for llm-inference-engine (pyproject.toml): started
2025-12-23T20:48:10,730 Running command Building wheel for llm-inference-engine (pyproject.toml)
2025-12-23T20:48:11,451 Building wheel for llm-inference-engine (pyproject.toml): finished with status 'done'
2025-12-23T20:48:11,455 Created wheel for llm-inference-engine: filename=llm_inference_engine-0.1.1-py3-none-any.whl size=10537 sha256=13de0552e2cd23270bcab4b8693e608bf6b51d5bece9f7317609c5441d7a185c
2025-12-23T20:48:11,456 Stored in directory: /tmp/pip-ephem-wheel-cache-lfw2f70i/wheels/3c/b9/a8/455b614adf70f839d532081f6add311215f956f33fa06f165b
2025-12-23T20:48:11,465 Successfully built llm-inference-engine
2025-12-23T20:48:11,468 Removed build tracker: '/tmp/pip-build-tracker-70spwcw8'
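Aside: the "Created wheel" entry near the end records a sha256 digest for the freshly built wheel. As an illustration (not part of the log itself), that digest can be recomputed with Python's standard `hashlib`; the wheel path in the usage comment is a hypothetical placeholder, not a path taken from this log.

```python
import hashlib

def wheel_sha256(path: str) -> str:
    """Recompute the sha256 digest that pip reports for a built wheel."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        # Read in fixed-size chunks so large wheels need not fit in memory.
        for chunk in iter(lambda: fh.read(64 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical usage against a locally built wheel:
# wheel_sha256("dist/llm_inference_engine-0.1.1-py3-none-any.whl")
```

Comparing this digest against the value in the log (or in a `#sha256=` fragment on an index link) is a quick way to confirm a downloaded or cached wheel is the same artifact the build produced.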