2025-12-23T02:45:10,565 Created temporary directory: /tmp/pip-ephem-wheel-cache-m91e4nvs
2025-12-23T02:45:10,567 Created temporary directory: /tmp/pip-build-tracker-ulkcvmq3
2025-12-23T02:45:10,568 Initialized build tracking at /tmp/pip-build-tracker-ulkcvmq3
2025-12-23T02:45:10,568 Created build tracker: /tmp/pip-build-tracker-ulkcvmq3
2025-12-23T02:45:10,569 Entered build tracker: /tmp/pip-build-tracker-ulkcvmq3
2025-12-23T02:45:10,570 Created temporary directory: /tmp/pip-wheel-yezbu2j9
2025-12-23T02:45:10,572 DEPRECATION: --no-binary currently disables reading from the cache of locally built wheels. In the future --no-binary will not influence the wheel cache. pip 23.1 will enforce this behaviour change. A possible replacement is to use the --no-cache-dir option. You can use the flag --use-feature=no-binary-enable-wheel-cache to test the upcoming behaviour. Discussion can be found at https://github.com/pypa/pip/issues/11453
2025-12-23T02:45:10,575 Created temporary directory: /tmp/pip-ephem-wheel-cache-zlmn3w40
2025-12-23T02:45:10,596 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
2025-12-23T02:45:10,600 2 location(s) to search for versions of llm-inference-engine:
2025-12-23T02:45:10,600 * https://pypi.org/simple/llm-inference-engine/
2025-12-23T02:45:10,600 * https://www.piwheels.org/simple/llm-inference-engine/
2025-12-23T02:45:10,600 Fetching project page and analyzing links: https://pypi.org/simple/llm-inference-engine/
2025-12-23T02:45:10,601 Getting page https://pypi.org/simple/llm-inference-engine/
2025-12-23T02:45:10,603 Found index url https://pypi.org/simple
2025-12-23T02:45:10,810 Fetched page https://pypi.org/simple/llm-inference-engine/ as application/vnd.pypi.simple.v1+json
2025-12-23T02:45:10,812 Skipping link: No binaries permitted for llm-inference-engine: https://files.pythonhosted.org/packages/fa/f7/c4c711d8734811e438edb9bb9499caeefbf5efeb1e78bc9504cca7a5aac3/llm_inference_engine-0.1.0-py3-none-any.whl (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11)
2025-12-23T02:45:10,813 Found link https://files.pythonhosted.org/packages/a2/e9/3eebe7915f5f43fdc0fcde496c8151701785895c64791bf379a9c2de9f4b/llm_inference_engine-0.1.0.tar.gz (from https://pypi.org/simple/llm-inference-engine/) (requires-python:<4.0,>=3.11), version: 0.1.0
2025-12-23T02:45:10,814 Fetching project page and analyzing links: https://www.piwheels.org/simple/llm-inference-engine/
2025-12-23T02:45:10,815 Getting page https://www.piwheels.org/simple/llm-inference-engine/
2025-12-23T02:45:10,816 Found index url https://www.piwheels.org/simple
2025-12-23T02:45:10,967 Fetched page https://www.piwheels.org/simple/llm-inference-engine/ as text/html
2025-12-23T02:45:10,968 Skipping link: not a file: https://www.piwheels.org/simple/llm-inference-engine/
2025-12-23T02:45:10,969 Skipping link: not a file: https://pypi.org/simple/llm-inference-engine/
2025-12-23T02:45:10,988 Given no hashes to check 1 links for project 'llm-inference-engine': discarding no candidates
2025-12-23T02:45:11,005 Collecting llm-inference-engine==0.1.0
2025-12-23T02:45:11,008 Created temporary directory: /tmp/pip-unpack-m6dfenaq
2025-12-23T02:45:11,261 Downloading llm_inference_engine-0.1.0.tar.gz (9.0 kB)
2025-12-23T02:45:11,284 Added llm-inference-engine==0.1.0 from https://files.pythonhosted.org/packages/a2/e9/3eebe7915f5f43fdc0fcde496c8151701785895c64791bf379a9c2de9f4b/llm_inference_engine-0.1.0.tar.gz to build tracker '/tmp/pip-build-tracker-ulkcvmq3'
2025-12-23T02:45:11,288 Created temporary directory: /tmp/pip-build-env-b21w7ioy
2025-12-23T02:45:11,292 Installing build dependencies: started
2025-12-23T02:45:11,294 Running command pip subprocess to install build dependencies
2025-12-23T02:45:12,421 Using pip 23.0.1 from /usr/lib/python3/dist-packages/pip (python 3.11)
2025-12-23T02:45:13,074 DEPRECATION: --no-binary currently disables reading from the cache of locally built wheels. In the future --no-binary will not influence the wheel cache. pip 23.1 will enforce this behaviour change. A possible replacement is to use the --no-cache-dir option. You can use the flag --use-feature=no-binary-enable-wheel-cache to test the upcoming behaviour. Discussion can be found at https://github.com/pypa/pip/issues/11453
2025-12-23T02:45:13,096 Looking in indexes: https://pypi.org/simple, https://www.piwheels.org/simple
2025-12-23T02:45:13,614 Collecting poetry-core
2025-12-23T02:45:13,646 Using cached https://www.piwheels.org/simple/poetry-core/poetry_core-2.2.1-py3-none-any.whl (338 kB)
2025-12-23T02:45:16,150 Installing collected packages: poetry-core
2025-12-23T02:45:17,434 Successfully installed poetry-core-2.2.1
2025-12-23T02:45:17,663 Installing build dependencies: finished with status 'done'
2025-12-23T02:45:17,669 Getting requirements to build wheel: started
2025-12-23T02:45:17,671 Running command Getting requirements to build wheel
2025-12-23T02:45:18,132 Getting requirements to build wheel: finished with status 'done'
2025-12-23T02:45:18,135 Created temporary directory: /tmp/pip-modern-metadata-cp7z35gr
2025-12-23T02:45:18,138 Preparing metadata (pyproject.toml): started
2025-12-23T02:45:18,139 Running command Preparing metadata (pyproject.toml)
2025-12-23T02:45:18,817 Preparing metadata (pyproject.toml): finished with status 'done'
2025-12-23T02:45:18,821 Source in /tmp/pip-wheel-yezbu2j9/llm-inference-engine_4bb6b979da1841fd9f8f04e8eef89d4f has version 0.1.0, which satisfies requirement llm-inference-engine==0.1.0 from https://files.pythonhosted.org/packages/a2/e9/3eebe7915f5f43fdc0fcde496c8151701785895c64791bf379a9c2de9f4b/llm_inference_engine-0.1.0.tar.gz
2025-12-23T02:45:18,822 Removed llm-inference-engine==0.1.0 from https://files.pythonhosted.org/packages/a2/e9/3eebe7915f5f43fdc0fcde496c8151701785895c64791bf379a9c2de9f4b/llm_inference_engine-0.1.0.tar.gz from build tracker '/tmp/pip-build-tracker-ulkcvmq3'
2025-12-23T02:45:18,827 Created temporary directory: /tmp/pip-unpack-oh78jwxd
2025-12-23T02:45:18,827 Building wheels for collected packages: llm-inference-engine
2025-12-23T02:45:18,832 Created temporary directory: /tmp/pip-wheel-h72d0a7q
2025-12-23T02:45:18,832 Destination directory: /tmp/pip-wheel-h72d0a7q
2025-12-23T02:45:18,834 Building wheel for llm-inference-engine (pyproject.toml): started
2025-12-23T02:45:18,836 Running command Building wheel for llm-inference-engine (pyproject.toml)
2025-12-23T02:45:19,552 Building wheel for llm-inference-engine (pyproject.toml): finished with status 'done'
2025-12-23T02:45:19,556 Created wheel for llm-inference-engine: filename=llm_inference_engine-0.1.0-py3-none-any.whl size=10496 sha256=567bada763146e979aca8976400d6255a42006fd5456d1c9364b975ed2a5f720
2025-12-23T02:45:19,557 Stored in directory: /tmp/pip-ephem-wheel-cache-zlmn3w40/wheels/a5/58/97/de276126ff16ba87144327ab1c314d2efcf5b9dbe4fe4a759c
2025-12-23T02:45:19,566 Successfully built llm-inference-engine
2025-12-23T02:45:19,569 Removed build tracker: '/tmp/pip-build-tracker-ulkcvmq3'