Created temporary directory: /tmp/pip-ephem-wheel-cache-241tcipy
Created temporary directory: /tmp/pip-req-tracker-l9h_7z96
Created requirements tracker '/tmp/pip-req-tracker-l9h_7z96'
Created temporary directory: /tmp/pip-wheel-tnrrpl3q
Collecting tiny-tokenizer==3.0.0
1 location(s) to search for versions of tiny-tokenizer:
* https://pypi.org/simple/tiny-tokenizer/
Getting page https://pypi.org/simple/tiny-tokenizer/
Analyzing links from page https://pypi.org/simple/tiny-tokenizer/
Found link https://files.pythonhosted.org/packages/52/92/abdf9fd52dd3642159dfc4265b4e600586d879a4b9912e896c62708ddefd/tiny_tokenizer-1.0.tar.gz#sha256=313b79624c6f10dc48eb6ead2652735d51891bdbea220380f4ee5a58d47535f9 (from https://pypi.org/simple/tiny-tokenizer/), version: 1.0
Found link https://files.pythonhosted.org/packages/3e/cd/f77a28ed912de0bb20d9f8b47720898519859cada125670d1fb51046f7f9/tiny_tokenizer-1.0.1.tar.gz#sha256=ea0ea0c99f5efacbdae51c985cbfba4e06e9f44600f21c5d706183c350b9f930 (from https://pypi.org/simple/tiny-tokenizer/), version: 1.0.1
Found link https://files.pythonhosted.org/packages/26/ab/ac32385a167d1777a6e075b9f383571bd481a2a8ed076874774e193cebb5/tiny_tokenizer-1.0.2.tar.gz#sha256=1e5f45cdd037623035cc62b2a9ce268c368bfb918ea3e3c452ad04d32778b015 (from https://pypi.org/simple/tiny-tokenizer/), version: 1.0.2
Found link https://files.pythonhosted.org/packages/f8/54/120814cbb0935e8d06c48fe0f2d134c865cc41b485bc3d2687b508a54202/tiny_tokenizer-1.0.3.tar.gz#sha256=421d0b4b006eb9667b734bc4d49cf5dcfe8c94aaa1b833cf84df10bc8ccd5ee8 (from https://pypi.org/simple/tiny-tokenizer/), version: 1.0.3
Found link https://files.pythonhosted.org/packages/51/19/91ea3fe860400ac0470a298c793ba318a7ceb4e3295f0f58833e8b586e91/tiny_tokenizer-1.0.4.tar.gz#sha256=a286b89e4f09adc6c456b90d8bd6ba2ff62dd30003d67b31b26af8cfef023214 (from https://pypi.org/simple/tiny-tokenizer/), version: 1.0.4
Found link https://files.pythonhosted.org/packages/9b/2a/712c28cd7daf368c713161f32a6ec426aaf2c314311fea97a4f78dd7df0d/tiny_tokenizer-1.1.0.tar.gz#sha256=f6567ccebdfc20778776c40f0f07bfa25c6d963a35e5ca83112a1c7bb5d74231 (from https://pypi.org/simple/tiny-tokenizer/), version: 1.1.0
Found link https://files.pythonhosted.org/packages/dc/84/9feb0f84ff5aa183733f9fc00e8d0f7d432a5276e2587fbed5114bd8167d/tiny_tokenizer-1.2.0.tar.gz#sha256=c51b37bd7ed25432f5ec5707b9b3bf64825c1a8fb847e6b5ca2ca35d3a1928af (from https://pypi.org/simple/tiny-tokenizer/), version: 1.2.0
Found link https://files.pythonhosted.org/packages/74/27/9bc6eea6d6d840afc8db5ece7a872ec7e0380c3d65c075efa746aee18d55/tiny_tokenizer-1.2.1.tar.gz#sha256=957096a8d2d4b28c149304137bd9918a05057d2dc27d54938ead837e8f778a00 (from https://pypi.org/simple/tiny-tokenizer/), version: 1.2.1
Found link https://files.pythonhosted.org/packages/60/9f/7033b9ac3dceb01b722623a2aa48fd0634bc4f16429a2d3e6f840bcd8862/tiny_tokenizer-1.3.0.tar.gz#sha256=ac778277c621a446eebba0c84e41ececeb1487c2a356550ad0cfd7b099631fa8 (from https://pypi.org/simple/tiny-tokenizer/), version: 1.3.0
Found link https://files.pythonhosted.org/packages/b8/75/57fc79f9f0232ae80bd351231996b72946d6b9f071c66f35fb58eae30537/tiny_tokenizer-1.3.1.tar.gz#sha256=45aeec99b04fb8f3d81b9e536fea7441a396df487530a6762311fda377feb2c2 (from https://pypi.org/simple/tiny-tokenizer/), version: 1.3.1
Found link https://files.pythonhosted.org/packages/8b/98/da3f8ca178bb5f6ec562b4f8330be4dd65296b4fc2df9ba854b5929f432e/tiny_tokenizer-2.0.0.tar.gz#sha256=066c125a629f1d5290930fd0b107e12f74c1c084c4154013c18f86401badf1de (from https://pypi.org/simple/tiny-tokenizer/), version: 2.0.0
Found link https://files.pythonhosted.org/packages/a6/41/250f027f5ed9a6706a1839eef43911961f11b5adac03af57917817b394d2/tiny_tokenizer-2.1.0.tar.gz#sha256=582d2cebc0e17dcf7cb8783e3be87e5ad40390f0da538ac0aa491655bbefc727 (from https://pypi.org/simple/tiny-tokenizer/), version: 2.1.0
Found link https://files.pythonhosted.org/packages/ec/d4/036172ce54320d939b78988f1b71656bf9973be0f9fa32fa0edb3bd3b711/tiny_tokenizer-3.0.0.tar.gz#sha256=5b2df03fb6a3c8e33348263960000455f91b8b7bb40007b88eb894898f3ea17f (from https://pypi.org/simple/tiny-tokenizer/), version: 3.0.0
Using version 3.0.0 (newest of versions: 3.0.0)
Created temporary directory: /tmp/pip-unpack-rlunh4u0
Downloading https://files.pythonhosted.org/packages/ec/d4/036172ce54320d939b78988f1b71656bf9973be0f9fa32fa0edb3bd3b711/tiny_tokenizer-3.0.0.tar.gz
Downloading from URL https://files.pythonhosted.org/packages/ec/d4/036172ce54320d939b78988f1b71656bf9973be0f9fa32fa0edb3bd3b711/tiny_tokenizer-3.0.0.tar.gz#sha256=5b2df03fb6a3c8e33348263960000455f91b8b7bb40007b88eb894898f3ea17f (from https://pypi.org/simple/tiny-tokenizer/)
Added tiny-tokenizer==3.0.0 from https://files.pythonhosted.org/packages/ec/d4/036172ce54320d939b78988f1b71656bf9973be0f9fa32fa0edb3bd3b711/tiny_tokenizer-3.0.0.tar.gz#sha256=5b2df03fb6a3c8e33348263960000455f91b8b7bb40007b88eb894898f3ea17f to build tracker '/tmp/pip-req-tracker-l9h_7z96'
Running setup.py (path:/tmp/pip-wheel-tnrrpl3q/tiny-tokenizer/setup.py) egg_info for package tiny-tokenizer
Running command python setup.py egg_info
running egg_info
creating pip-egg-info/tiny_tokenizer.egg-info
writing pip-egg-info/tiny_tokenizer.egg-info/PKG-INFO
writing requirements to pip-egg-info/tiny_tokenizer.egg-info/requires.txt
writing dependency_links to pip-egg-info/tiny_tokenizer.egg-info/dependency_links.txt
writing top-level names to pip-egg-info/tiny_tokenizer.egg-info/top_level.txt
writing manifest file 'pip-egg-info/tiny_tokenizer.egg-info/SOURCES.txt'
reading manifest file 'pip-egg-info/tiny_tokenizer.egg-info/SOURCES.txt'
writing manifest file 'pip-egg-info/tiny_tokenizer.egg-info/SOURCES.txt'
Source in /tmp/pip-wheel-tnrrpl3q/tiny-tokenizer has version 3.0.0, which satisfies requirement tiny-tokenizer==3.0.0 from https://files.pythonhosted.org/packages/ec/d4/036172ce54320d939b78988f1b71656bf9973be0f9fa32fa0edb3bd3b711/tiny_tokenizer-3.0.0.tar.gz#sha256=5b2df03fb6a3c8e33348263960000455f91b8b7bb40007b88eb894898f3ea17f
Removed tiny-tokenizer==3.0.0 from https://files.pythonhosted.org/packages/ec/d4/036172ce54320d939b78988f1b71656bf9973be0f9fa32fa0edb3bd3b711/tiny_tokenizer-3.0.0.tar.gz#sha256=5b2df03fb6a3c8e33348263960000455f91b8b7bb40007b88eb894898f3ea17f from build tracker '/tmp/pip-req-tracker-l9h_7z96'
Building wheels for collected packages: tiny-tokenizer
Created temporary directory: /tmp/pip-wheel-jm4xdryt
Running setup.py bdist_wheel for tiny-tokenizer: started
Destination directory: /tmp/pip-wheel-jm4xdryt
Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-wheel-tnrrpl3q/tiny-tokenizer/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-jm4xdryt
running bdist_wheel
running build
running build_py
creating build
creating build/lib
creating build/lib/tiny_tokenizer
copying tiny_tokenizer/sentence_tokenizer.py -> build/lib/tiny_tokenizer
copying tiny_tokenizer/word_tokenizer.py -> build/lib/tiny_tokenizer
copying tiny_tokenizer/tiny_tokenizer_token.py -> build/lib/tiny_tokenizer
copying tiny_tokenizer/__init__.py -> build/lib/tiny_tokenizer
creating build/lib/tiny_tokenizer/word_tokenizers
copying tiny_tokenizer/word_tokenizers/sentencepiece_tokenizer.py -> build/lib/tiny_tokenizer/word_tokenizers
copying tiny_tokenizer/word_tokenizers/tokenizer.py -> build/lib/tiny_tokenizer/word_tokenizers
copying tiny_tokenizer/word_tokenizers/__init__.py -> build/lib/tiny_tokenizer/word_tokenizers
copying tiny_tokenizer/word_tokenizers/kytea_tokenizer.py -> build/lib/tiny_tokenizer/word_tokenizers
copying tiny_tokenizer/word_tokenizers/sudachi_tokenizer.py -> build/lib/tiny_tokenizer/word_tokenizers
copying tiny_tokenizer/word_tokenizers/character_tokenizer.py -> build/lib/tiny_tokenizer/word_tokenizers
copying tiny_tokenizer/word_tokenizers/mecab_tokenizer.py -> build/lib/tiny_tokenizer/word_tokenizers
installing to build/bdist.linux-armv7l/wheel
running install
running install_lib
creating build/bdist.linux-armv7l
creating build/bdist.linux-armv7l/wheel
creating build/bdist.linux-armv7l/wheel/tiny_tokenizer
copying build/lib/tiny_tokenizer/sentence_tokenizer.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer
copying build/lib/tiny_tokenizer/word_tokenizer.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer
copying build/lib/tiny_tokenizer/tiny_tokenizer_token.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer
creating build/bdist.linux-armv7l/wheel/tiny_tokenizer/word_tokenizers
copying build/lib/tiny_tokenizer/word_tokenizers/sentencepiece_tokenizer.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer/word_tokenizers
copying build/lib/tiny_tokenizer/word_tokenizers/tokenizer.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer/word_tokenizers
copying build/lib/tiny_tokenizer/word_tokenizers/__init__.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer/word_tokenizers
copying build/lib/tiny_tokenizer/word_tokenizers/kytea_tokenizer.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer/word_tokenizers
copying build/lib/tiny_tokenizer/word_tokenizers/sudachi_tokenizer.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer/word_tokenizers
copying build/lib/tiny_tokenizer/word_tokenizers/character_tokenizer.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer/word_tokenizers
copying build/lib/tiny_tokenizer/word_tokenizers/mecab_tokenizer.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer/word_tokenizers
copying build/lib/tiny_tokenizer/__init__.py -> build/bdist.linux-armv7l/wheel/tiny_tokenizer
running install_egg_info
running egg_info
writing tiny_tokenizer.egg-info/PKG-INFO
writing requirements to tiny_tokenizer.egg-info/requires.txt
writing top-level names to tiny_tokenizer.egg-info/top_level.txt
writing dependency_links to tiny_tokenizer.egg-info/dependency_links.txt
reading manifest file 'tiny_tokenizer.egg-info/SOURCES.txt'
writing manifest file 'tiny_tokenizer.egg-info/SOURCES.txt'
Copying tiny_tokenizer.egg-info to build/bdist.linux-armv7l/wheel/tiny_tokenizer-3.0.0-py3.4.egg-info
running install_scripts
creating build/bdist.linux-armv7l/wheel/tiny_tokenizer-3.0.0.dist-info/WHEEL
creating '/tmp/pip-wheel-jm4xdryt/tiny_tokenizer-3.0.0-py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it
adding 'tiny_tokenizer/__init__.py'
adding 'tiny_tokenizer/sentence_tokenizer.py'
adding 'tiny_tokenizer/tiny_tokenizer_token.py'
adding 'tiny_tokenizer/word_tokenizer.py'
adding 'tiny_tokenizer/word_tokenizers/__init__.py'
adding 'tiny_tokenizer/word_tokenizers/character_tokenizer.py'
adding 'tiny_tokenizer/word_tokenizers/kytea_tokenizer.py'
adding 'tiny_tokenizer/word_tokenizers/mecab_tokenizer.py'
adding 'tiny_tokenizer/word_tokenizers/sentencepiece_tokenizer.py'
adding 'tiny_tokenizer/word_tokenizers/sudachi_tokenizer.py'
adding 'tiny_tokenizer/word_tokenizers/tokenizer.py'
adding 'tiny_tokenizer-3.0.0.dist-info/METADATA'
adding 'tiny_tokenizer-3.0.0.dist-info/WHEEL'
adding 'tiny_tokenizer-3.0.0.dist-info/top_level.txt'
adding 'tiny_tokenizer-3.0.0.dist-info/RECORD'
removing build/bdist.linux-armv7l/wheel
Running setup.py bdist_wheel for tiny-tokenizer: finished with status 'done'
Stored in directory: /tmp/tmpqjxqh13o
Successfully built tiny-tokenizer
Cleaning up...
Removing source in /tmp/pip-wheel-tnrrpl3q/tiny-tokenizer
Removed build tracker '/tmp/pip-req-tracker-l9h_7z96'
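The resolution step in the log ("Using version 3.0.0 (newest of versions: 3.0.0)") filters the links found on the index page through the `==3.0.0` pin and keeps the newest match. A minimal stand-in sketch of that selection, not pip's actual resolver (real resolution uses PEP 440 parsing, which also covers pre-releases and epochs):

```python
# Versions found on https://pypi.org/simple/tiny-tokenizer/, per the log above.
found = ["1.0", "1.0.1", "1.0.2", "1.0.3", "1.0.4", "1.1.0",
         "1.2.0", "1.2.1", "1.3.0", "1.3.1", "2.0.0", "2.1.0", "3.0.0"]

def parse(v: str) -> tuple:
    """Turn '1.2.1' into (1, 2, 1) so versions compare numerically,
    not lexicographically (simplified; no pre-release handling)."""
    return tuple(int(part) for part in v.split("."))

pinned = "3.0.0"
# A '==' pin keeps only exact matches; newest of those wins.
candidates = [v for v in found if parse(v) == parse(pinned)]
newest = max(candidates, key=parse)
print(newest)  # -> 3.0.0
```

With an exact pin the candidate list collapses to a single entry, which is why the log reports "newest of versions: 3.0.0" rather than the full list.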
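Each download link in the log carries a `#sha256=` fragment, and pip checks the fetched archive against that digest before building. A standard-library sketch of the same check (the payload below is a stand-in, not the real tarball bytes):

```python
import hashlib

def matches_sha256(data: bytes, expected_hex: str) -> bool:
    """True when data hashes to the sha256 digest taken from the URL fragment."""
    return hashlib.sha256(data).hexdigest() == expected_hex

# Stand-in bytes; pip runs this comparison on the downloaded .tar.gz contents.
payload = b"stand-in for tiny_tokenizer-3.0.0.tar.gz"
expected = hashlib.sha256(payload).hexdigest()
print(matches_sha256(payload, expected))         # -> True
print(matches_sha256(payload + b"x", expected))  # -> False
```

A mismatch at this point would abort the install, which is what makes the `#sha256=` fragments in the index links more than decoration.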