Collecting JapaneseTokenizer==1.3.5 1 location(s) to search for versions of JapaneseTokenizer: * https://pypi.python.org/simple/japanesetokenizer/ Getting page https://pypi.python.org/simple/japanesetokenizer/ Analyzing links from page https://pypi.python.org/simple/japanesetokenizer/ Found link https://pypi.python.org/packages/08/ed/e970d4d49554e7ddf5ed3edec1e5f1d8265c47e44c6de0b4e71191a15c7d/JapaneseTokenizer-1.2.7.tar.gz#md5=4caa2f2fd7e0a4e71d0d95b319e2f960 (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.2.7 Found link https://pypi.python.org/packages/0c/b7/c54719202bbd67b7a44ba1d53f58d5557261d4a523fe1bfe1aeceab9ca65/JapaneseTokenizer-1.3.5.tar.gz#md5=e24bafd9da63971195ce7f1590e7487c (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.3.5 Found link https://pypi.python.org/packages/13/61/6586dc1b41cdcf05f1f49ea3d92cc4e14341c7879dcce33fab693383da25/JapaneseTokenizer-0.8.tar.gz#md5=aecb2a18cabf7ab5fcae6dd5147390e8 (from https://pypi.python.org/simple/japanesetokenizer/), version: 0.8 Found link https://pypi.python.org/packages/2a/26/0c91f48b42d8059f969c2ab9a56b29b351eac15a139744f17fb98fa0accb/JapaneseTokenizer-1.2.5.tar.gz#md5=30d57d0c20f73a976f3be343738f5f55 (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.2.5 Found link https://pypi.python.org/packages/54/b5/79590c298941926efd0b4fbe4906979bfcb16199dd371e46d2a88144b8c4/JapaneseTokenizer-1.0.tar.gz#md5=2b10f181b00cc626eed106965163d8e3 (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.0 Found link https://pypi.python.org/packages/5b/96/cc92357c7e7261c791db6f769bda1cd2c1f4547ea64bd1b75227c2818643/JapaneseTokenizer-0.6a1.tar.gz#md5=e55f95650df77ef126ee23b2cc88ad20 (from https://pypi.python.org/simple/japanesetokenizer/), version: 0.6a1 Found link https://pypi.python.org/packages/79/19/c950a7b3ba817a9fc56e9d44d6698a401e876f54de404a760f8a1100a34a/JapaneseTokenizer-1.3.0.tar.gz#md5=13b4072229b14619ae7b8cd672c47fbe (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.3.0 Found link https://pypi.python.org/packages/82/47/8727b3a859e6c3cff4c3fdb2a5fc068d355bd346a22db38730e3ae9bd1ce/JapaneseTokenizer-1.0a0.tar.gz#md5=87fa79c3609690d6b86295c494af7d6f (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.0a0 Found link https://pypi.python.org/packages/87/2b/6ad05fb2afe606550828fba2c1da98629bf2f34a5523535d7214ea9c15c0/JapaneseTokenizer-1.2.3.tar.gz#md5=da97ce49089ebc76f08aa476a9c1cdce (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.2.3 Found link https://pypi.python.org/packages/ab/73/cfc8a35e964e78dbc8a72cd631b4f84d837279ae34501381a115d83f6c58/JapaneseTokenizer-1.3.4.tar.gz#md5=29140724b608a9a2b8232c7c6bfb35d8 (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.3.4 Found link https://pypi.python.org/packages/b4/1c/9939737367a9fbd76d83fb3785676f1c25f441e73612e2b8ef7cd0f96ca4/JapaneseTokenizer-1.3.3.tar.gz#md5=3c19ae9d41dc190c59be3664e3c9b659 (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.3.3 Found link https://pypi.python.org/packages/b6/b2/089232b3c76d3e802e0fd1c17eb7d96f9bbdc4034d8a55b3bfa18d241d64/JapaneseTokenizer-1.0b0.tar.gz#md5=b6d953a9ed3efe28dfbc1a5c9f3287a2 (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.0b0 Found link https://pypi.python.org/packages/c0/06/e6629d1113b7f9061702969154baed98cdb75d3648ff3d0270d87b17a7a5/JapaneseTokenizer-1.3.1.tar.gz#md5=f70e1e0d82111714c1958084eef1c813 (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.3.1 
Found link https://pypi.python.org/packages/c6/7a/e5b1c02a0e1c055d17f171fe04925556f419b59f871902a77f6e83a048af/JapaneseTokenizer-0.7.tar.gz#md5=bdba410c246604b53599e6431ee8ba97 (from https://pypi.python.org/simple/japanesetokenizer/), version: 0.7
Found link https://pypi.python.org/packages/d5/6a/e52ba521d54169b40acf7cb025b68e1e7119e28da6be9416c99054111bcd/JapaneseTokenizer-0.9.tar.gz#md5=965d1893a35b65ed9c81d09492834f3b (from https://pypi.python.org/simple/japanesetokenizer/), version: 0.9
Found link https://pypi.python.org/packages/eb/9c/9197907e3cd0c13f03cd887efef234bc0fe359a05a844e7b82405ddab282/JapaneseTokenizer-1.2.6.tar.gz#md5=e6389d1aa28127631fb9595e19a6fd23 (from https://pypi.python.org/simple/japanesetokenizer/), version: 1.2.6
Using version 1.3.5 (newest of versions: 1.3.5)
Downloading JapaneseTokenizer-1.3.5.tar.gz
Downloading from URL https://pypi.python.org/packages/0c/b7/c54719202bbd67b7a44ba1d53f58d5557261d4a523fe1bfe1aeceab9ca65/JapaneseTokenizer-1.3.5.tar.gz#md5=e24bafd9da63971195ce7f1590e7487c (from https://pypi.python.org/simple/japanesetokenizer/)
Running setup.py (path:/tmp/pip-build-md0x29rt/JapaneseTokenizer/setup.py) egg_info for package JapaneseTokenizer
Running command python setup.py egg_info
Collecting kytea
Using cached kytea-0.1.3.tar.gz
Building wheels for collected packages: kytea
Running setup.py bdist_wheel for kytea: started
Running setup.py bdist_wheel for kytea: finished with status 'error'
Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-m03f8at2/kytea/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmp144xe508pip-wheel- --python-tag cp34:
/usr/lib/python3.4/distutils/dist.py:260: UserWarning: Unknown distribution option: 'classifier'
warnings.warn(msg)
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-armv7l-3.4
copying lib/kytea/Mykytea.py -> build/lib.linux-armv7l-3.4
running build_ext
building '_Mykytea' extension
creating build/temp.linux-armv7l-3.4
creating build/temp.linux-armv7l-3.4/lib
creating build/temp.linux-armv7l-3.4/lib/kytea
arm-linux-gnueabihf-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -g -fstack-protector-strong -Wformat -Werror=format-security -D_FORTIFY_SOURCE=2 -fPIC -Ilib/kytea -I/usr/include/python3.4m -c lib/kytea/mykytea_wrap.cxx -o build/temp.linux-armv7l-3.4/lib/kytea/mykytea_wrap.o
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
In file included from lib/kytea/mykytea_wrap.cxx:3534:0:
lib/kytea/mykytea.hpp:10:25: fatal error: kytea/kytea.h: No such file or directory
 #include <kytea/kytea.h>
                        ^
compilation terminated.
error: command 'arm-linux-gnueabihf-gcc' failed with exit status 1
----------------------------------------
Failed building wheel for kytea
Running setup.py clean for kytea
Failed to build kytea
Installing collected packages: kytea
Running setup.py install for kytea: started
Running setup.py install for kytea: finished with status 'error'
Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-m03f8at2/kytea/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-7szagjar-record/install-record.txt --single-version-externally-managed --compile:
/usr/lib/python3.4/distutils/dist.py:260: UserWarning: Unknown distribution option: 'classifier'
warnings.warn(msg)
running install
running build
running build_py
creating build
creating build/lib.linux-armv7l-3.4
copying lib/kytea/Mykytea.py -> build/lib.linux-armv7l-3.4
running build_ext
building '_Mykytea' extension
creating build/temp.linux-armv7l-3.4
creating build/temp.linux-armv7l-3.4/lib
creating build/temp.linux-armv7l-3.4/lib/kytea
arm-linux-gnueabihf-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -g -fstack-protector-strong -Wformat -Werror=format-security -D_FORTIFY_SOURCE=2 -fPIC -Ilib/kytea -I/usr/include/python3.4m -c lib/kytea/mykytea_wrap.cxx -o build/temp.linux-armv7l-3.4/lib/kytea/mykytea_wrap.o
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
In file included from lib/kytea/mykytea_wrap.cxx:3534:0:
lib/kytea/mykytea.hpp:10:25: fatal error: kytea/kytea.h: No such file or directory
 #include <kytea/kytea.h>
                        ^
compilation terminated.
error: command 'arm-linux-gnueabihf-gcc' failed with exit status 1
----------------------------------------
Command "/usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-m03f8at2/kytea/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-7szagjar-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-m03f8at2/kytea/
We failed to install mykytea automatically. Try installing kytea manually.
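The kytea wheel fails because the KyTea C++ library itself is not installed: the Python binding only wraps an existing installation, and the compiler cannot find kytea/kytea.h. A minimal sketch of the manual route, assuming the current KyTea release from http://www.phontron.com/kytea/ (the 0.4.7 version number and the default /usr/local prefix below are assumptions; check the site for the latest tarball):

wget http://www.phontron.com/kytea/download/kytea-0.4.7.tar.gz   # URL pattern and version are assumptions
tar xzf kytea-0.4.7.tar.gz
cd kytea-0.4.7
./configure                      # standard autotools build
make
sudo make install                # installs /usr/local/include/kytea/kytea.h and the libkytea library
sudo ldconfig                    # refresh the shared-library cache
cd ..
pip3 install kytea               # retry the Python binding once the headers exist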
Collecting http://nlp.ist.i.kyoto-u.ac.jp/DLcounter/lime.cgi?down=http://lotus.kuee.kyoto-u.ac.jp/nl-resource/pyknp/pyknp-0.3.tar.gz&name=pyknp-0.3.tar.gz Downloading http://nlp.ist.i.kyoto-u.ac.jp/DLcounter/lime.cgi?down=http://lotus.kuee.kyoto-u.ac.jp/nl-resource/pyknp/pyknp-0.3.tar.gz&name=pyknp-0.3.tar.gz Cannot determine compression type for file /tmp/pip-noh5xb37-unpack/lime.cgi Installing collected packages: pyknp Running setup.py install for pyknp: started Running setup.py install for pyknp: finished with status 'error' Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-w667af87-build/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-z0ynv4_3-record/install-record.txt --single-version-externally-managed --compile: running install running build running build_py creating build creating build/lib creating build/lib/pyknp copying pyknp/__init__.py -> build/lib/pyknp creating build/lib/pyknp/evaluate copying pyknp/evaluate/mrph.py -> build/lib/pyknp/evaluate copying pyknp/evaluate/phrase.py -> build/lib/pyknp/evaluate copying pyknp/evaluate/dep.py -> build/lib/pyknp/evaluate copying pyknp/evaluate/__init__.py -> build/lib/pyknp/evaluate copying pyknp/evaluate/scorer.py -> build/lib/pyknp/evaluate creating build/lib/pyknp/knp copying pyknp/knp/drawtree.py -> build/lib/pyknp/knp copying pyknp/knp/bunsetsu.py -> build/lib/pyknp/knp copying pyknp/knp/blist.py -> build/lib/pyknp/knp copying pyknp/knp/tag.py -> build/lib/pyknp/knp copying pyknp/knp/rel.py -> build/lib/pyknp/knp copying pyknp/knp/knp.py -> build/lib/pyknp/knp copying pyknp/knp/tlist.py -> build/lib/pyknp/knp copying pyknp/knp/syngraph.py -> build/lib/pyknp/knp copying pyknp/knp/__init__.py -> build/lib/pyknp/knp copying pyknp/knp/pas.py -> build/lib/pyknp/knp copying pyknp/knp/features.py -> build/lib/pyknp/knp copying pyknp/knp/simple.py -> build/lib/pyknp/knp creating build/lib/pyknp/juman copying pyknp/juman/juman.py -> build/lib/pyknp/juman copying pyknp/juman/jumanpp.py -> build/lib/pyknp/juman copying pyknp/juman/morpheme.py -> build/lib/pyknp/juman copying pyknp/juman/__init__.py -> build/lib/pyknp/juman copying pyknp/juman/mlist.py -> build/lib/pyknp/juman copying pyknp/juman/simple.py -> build/lib/pyknp/juman running build_scripts creating build/scripts-3.4 copying and adjusting pyknp/scripts/knp-drawtree -> build/scripts-3.4 changing mode of build/scripts-3.4/knp-drawtree from 644 to 755 running install_lib creating /usr/local/lib/python3.4/dist-packages/pyknp error: could not create '/usr/local/lib/python3.4/dist-packages/pyknp': Permission denied ---------------------------------------- Command "/usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-w667af87-build/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-z0ynv4_3-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-w667af87-build/ We failed to install pyknp automatically. Try installing pyknp manually. 
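pyknp itself builds fine here; the install only fails when writing to /usr/local/lib/python3.4/dist-packages without root. Two sketches that avoid that, using the pyknp URL exactly as pip printed it above (the --user scheme writes under ~/.local instead of the system directory):

# Option A: per-user install straight from the URL
pip3 install --user "http://nlp.ist.i.kyoto-u.ac.jp/DLcounter/lime.cgi?down=http://lotus.kuee.kyoto-u.ac.jp/nl-resource/pyknp/pyknp-0.3.tar.gz&name=pyknp-0.3.tar.gz"
# Option B: give the tarball its real name first (pip could not guess the type of "lime.cgi"), then install it
wget -O pyknp-0.3.tar.gz "http://nlp.ist.i.kyoto-u.ac.jp/DLcounter/lime.cgi?down=http://lotus.kuee.kyoto-u.ac.jp/nl-resource/pyknp/pyknp-0.3.tar.gz&name=pyknp-0.3.tar.gz"
pip3 install --user ./pyknp-0.3.tar.gz

Running the original pip command with sudo also works, at the cost of writing into the system dist-packages.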
python=sys.version_info(major=3, minor=4, micro=2, releaselevel='final', serial=0) running egg_info creating pip-egg-info/JapaneseTokenizer.egg-info writing dependency_links to pip-egg-info/JapaneseTokenizer.egg-info/dependency_links.txt writing top-level names to pip-egg-info/JapaneseTokenizer.egg-info/top_level.txt writing requirements to pip-egg-info/JapaneseTokenizer.egg-info/requires.txt writing pip-egg-info/JapaneseTokenizer.egg-info/PKG-INFO writing manifest file 'pip-egg-info/JapaneseTokenizer.egg-info/SOURCES.txt' Generating grammar tables from /usr/lib/python3.4/lib2to3/Grammar.txt Generating grammar tables from /usr/lib/python3.4/lib2to3/PatternGrammar.txt reading manifest file 'pip-egg-info/JapaneseTokenizer.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' Maybe try: sudo apt-get install pandoc See http://johnmacfarlane.net/pandoc/installing.html for installation options --------------------------------------------------------------- /usr/lib/python3.4/distutils/dist.py:260: UserWarning: Unknown distribution option: 'short_description' warnings.warn(msg) warning: no files found matching 'README_JP.md' warning: no files found matching 'examples' warning: no files found matching 'test' writing manifest file 'pip-egg-info/JapaneseTokenizer.egg-info/SOURCES.txt' Source in /tmp/pip-build-md0x29rt/JapaneseTokenizer has version 1.3.5, which satisfies requirement JapaneseTokenizer==1.3.5 from https://pypi.python.org/packages/0c/b7/c54719202bbd67b7a44ba1d53f58d5557261d4a523fe1bfe1aeceab9ca65/JapaneseTokenizer-1.3.5.tar.gz#md5=e24bafd9da63971195ce7f1590e7487c Building wheels for collected packages: JapaneseTokenizer Running setup.py bdist_wheel for JapaneseTokenizer: started Destination directory: /tmp/tmp5q88o_iipip-wheel- Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-md0x29rt/JapaneseTokenizer/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmp5q88o_iipip-wheel- Collecting kytea Using cached kytea-0.1.3.tar.gz Building wheels for collected packages: kytea Running setup.py bdist_wheel for kytea: started Running setup.py bdist_wheel for kytea: finished with status 'error' Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-lftaj9eb/kytea/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmp3qfavw4lpip-wheel- --python-tag cp34: /usr/lib/python3.4/distutils/dist.py:260: UserWarning: Unknown distribution option: 'classifier' warnings.warn(msg) running bdist_wheel running build running build_py creating build creating build/lib.linux-armv7l-3.4 copying lib/kytea/Mykytea.py -> build/lib.linux-armv7l-3.4 running build_ext building '_Mykytea' extension creating build/temp.linux-armv7l-3.4 creating build/temp.linux-armv7l-3.4/lib creating build/temp.linux-armv7l-3.4/lib/kytea arm-linux-gnueabihf-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -g -fstack-protector-strong -Wformat -Werror=format-security -D_FORTIFY_SOURCE=2 -fPIC -Ilib/kytea -I/usr/include/python3.4m -c lib/kytea/mykytea_wrap.cxx -o build/temp.linux-armv7l-3.4/lib/kytea/mykytea_wrap.o cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ In file included from lib/kytea/mykytea_wrap.cxx:3534:0: lib/kytea/mykytea.hpp:10:25: fatal error: 
kytea/kytea.h: No such file or directory
 #include <kytea/kytea.h>
                        ^
compilation terminated.
error: command 'arm-linux-gnueabihf-gcc' failed with exit status 1
----------------------------------------
Failed building wheel for kytea
Running setup.py clean for kytea
Failed to build kytea
Installing collected packages: kytea
Running setup.py install for kytea: started
Running setup.py install for kytea: finished with status 'error'
Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-lftaj9eb/kytea/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-1na0tlj5-record/install-record.txt --single-version-externally-managed --compile:
/usr/lib/python3.4/distutils/dist.py:260: UserWarning: Unknown distribution option: 'classifier'
warnings.warn(msg)
running install
running build
running build_py
creating build
creating build/lib.linux-armv7l-3.4
copying lib/kytea/Mykytea.py -> build/lib.linux-armv7l-3.4
running build_ext
building '_Mykytea' extension
creating build/temp.linux-armv7l-3.4
creating build/temp.linux-armv7l-3.4/lib
creating build/temp.linux-armv7l-3.4/lib/kytea
arm-linux-gnueabihf-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -g -fstack-protector-strong -Wformat -Werror=format-security -D_FORTIFY_SOURCE=2 -fPIC -Ilib/kytea -I/usr/include/python3.4m -c lib/kytea/mykytea_wrap.cxx -o build/temp.linux-armv7l-3.4/lib/kytea/mykytea_wrap.o
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++
In file included from lib/kytea/mykytea_wrap.cxx:3534:0:
lib/kytea/mykytea.hpp:10:25: fatal error: kytea/kytea.h: No such file or directory
 #include <kytea/kytea.h>
                        ^
compilation terminated.
error: command 'arm-linux-gnueabihf-gcc' failed with exit status 1
----------------------------------------
Command "/usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-lftaj9eb/kytea/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-1na0tlj5-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-lftaj9eb/kytea/
We failed to install mykytea automatically. Try installing kytea manually.
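The second kytea attempt fails for the same reason, so before re-running pip it is worth confirming that the headers and library from the manual KyTea build above are actually in place (paths assume the default /usr/local prefix):

ls /usr/local/include/kytea/kytea.h    # the header the failed compile was looking for
ls /usr/local/lib/libkytea*            # the library the _Mykytea extension links against
sudo ldconfig                          # make sure the linker cache covers /usr/local/lib
pip3 install kytea                     # should then be able to compile the extension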
Collecting http://nlp.ist.i.kyoto-u.ac.jp/DLcounter/lime.cgi?down=http://lotus.kuee.kyoto-u.ac.jp/nl-resource/pyknp/pyknp-0.3.tar.gz&name=pyknp-0.3.tar.gz Downloading http://nlp.ist.i.kyoto-u.ac.jp/DLcounter/lime.cgi?down=http://lotus.kuee.kyoto-u.ac.jp/nl-resource/pyknp/pyknp-0.3.tar.gz&name=pyknp-0.3.tar.gz Cannot determine compression type for file /tmp/pip-ct5gnrja-unpack/lime.cgi Installing collected packages: pyknp Running setup.py install for pyknp: started Running setup.py install for pyknp: finished with status 'error' Complete output from command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-kt61k9r0-build/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-_obgoz3z-record/install-record.txt --single-version-externally-managed --compile: running install running build running build_py creating build creating build/lib creating build/lib/pyknp copying pyknp/__init__.py -> build/lib/pyknp creating build/lib/pyknp/evaluate copying pyknp/evaluate/mrph.py -> build/lib/pyknp/evaluate copying pyknp/evaluate/phrase.py -> build/lib/pyknp/evaluate copying pyknp/evaluate/dep.py -> build/lib/pyknp/evaluate copying pyknp/evaluate/__init__.py -> build/lib/pyknp/evaluate copying pyknp/evaluate/scorer.py -> build/lib/pyknp/evaluate creating build/lib/pyknp/knp copying pyknp/knp/drawtree.py -> build/lib/pyknp/knp copying pyknp/knp/bunsetsu.py -> build/lib/pyknp/knp copying pyknp/knp/blist.py -> build/lib/pyknp/knp copying pyknp/knp/tag.py -> build/lib/pyknp/knp copying pyknp/knp/rel.py -> build/lib/pyknp/knp copying pyknp/knp/knp.py -> build/lib/pyknp/knp copying pyknp/knp/tlist.py -> build/lib/pyknp/knp copying pyknp/knp/syngraph.py -> build/lib/pyknp/knp copying pyknp/knp/__init__.py -> build/lib/pyknp/knp copying pyknp/knp/pas.py -> build/lib/pyknp/knp copying pyknp/knp/features.py -> build/lib/pyknp/knp copying pyknp/knp/simple.py -> build/lib/pyknp/knp creating build/lib/pyknp/juman copying pyknp/juman/juman.py -> build/lib/pyknp/juman copying pyknp/juman/jumanpp.py -> build/lib/pyknp/juman copying pyknp/juman/morpheme.py -> build/lib/pyknp/juman copying pyknp/juman/__init__.py -> build/lib/pyknp/juman copying pyknp/juman/mlist.py -> build/lib/pyknp/juman copying pyknp/juman/simple.py -> build/lib/pyknp/juman running build_scripts creating build/scripts-3.4 copying and adjusting pyknp/scripts/knp-drawtree -> build/scripts-3.4 changing mode of build/scripts-3.4/knp-drawtree from 644 to 755 running install_lib creating /usr/local/lib/python3.4/dist-packages/pyknp error: could not create '/usr/local/lib/python3.4/dist-packages/pyknp': Permission denied ---------------------------------------- Command "/usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-kt61k9r0-build/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /tmp/pip-_obgoz3z-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-kt61k9r0-build/ We failed to install pyknp automatically. Try installing pyknp manually. 
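This is the same Permission denied as before. Besides --user, a virtual environment sidesteps the root-owned dist-packages entirely; the sketch below assumes python3 -m venv works on this system (on Debian/Raspbian it may require the python3-venv package), and the KyTea C++ library must still be installed system-wide:

python3 -m venv ~/jptokenizer-env        # ~/jptokenizer-env is an arbitrary example path
source ~/jptokenizer-env/bin/activate
pip install --upgrade pip
pip install JapaneseTokenizer==1.3.5     # kytea and pyknp can then be installed into the same venv without sudo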
python=sys.version_info(major=3, minor=4, micro=2, releaselevel='final', serial=0) Maybe try: sudo apt-get install pandoc See http://johnmacfarlane.net/pandoc/installing.html for installation options --------------------------------------------------------------- /usr/lib/python3.4/distutils/dist.py:260: UserWarning: Unknown distribution option: 'short_description' warnings.warn(msg) running bdist_wheel running build running build_py Generating grammar tables from /usr/lib/python3.4/lib2to3/Grammar.txt Generating grammar tables from /usr/lib/python3.4/lib2to3/PatternGrammar.txt creating build creating build/lib creating build/lib/JapaneseTokenizer copying JapaneseTokenizer/object_models.py -> build/lib/JapaneseTokenizer copying JapaneseTokenizer/datamodels.py -> build/lib/JapaneseTokenizer copying JapaneseTokenizer/init_logger.py -> build/lib/JapaneseTokenizer copying JapaneseTokenizer/__init__.py -> build/lib/JapaneseTokenizer creating build/lib/test copying test/test_juman_wrapper_python2.py -> build/lib/test copying test/test_mecab_wrapper_python2.py -> build/lib/test copying test/test_filter_python2.py -> build/lib/test copying test/test_filter_python3.py -> build/lib/test copying test/test_jumanpp_wrapper_python2.py -> build/lib/test copying test/test_mecab_wrapper_python3.py -> build/lib/test copying test/test_kytea_wrapper_python2.py -> build/lib/test copying test/test_all.py -> build/lib/test copying test/test_jumanpp_wrapper_python3.py -> build/lib/test copying test/__init__.py -> build/lib/test copying test/test_kytea_wrapper_python3.py -> build/lib/test copying test/test_juman_wrapper_python3.py -> build/lib/test creating build/lib/JapaneseTokenizer/jumanpp_wrapper copying JapaneseTokenizer/jumanpp_wrapper/jumanpp_wrapper.py -> build/lib/JapaneseTokenizer/jumanpp_wrapper copying JapaneseTokenizer/jumanpp_wrapper/__init__.py -> build/lib/JapaneseTokenizer/jumanpp_wrapper creating build/lib/JapaneseTokenizer/kytea_wrapper copying JapaneseTokenizer/kytea_wrapper/kytea_wrapper.py -> build/lib/JapaneseTokenizer/kytea_wrapper copying JapaneseTokenizer/kytea_wrapper/__init__.py -> build/lib/JapaneseTokenizer/kytea_wrapper creating build/lib/JapaneseTokenizer/common copying JapaneseTokenizer/common/sever_handler.py -> build/lib/JapaneseTokenizer/common copying JapaneseTokenizer/common/text_preprocess.py -> build/lib/JapaneseTokenizer/common copying JapaneseTokenizer/common/timeout_handler.py -> build/lib/JapaneseTokenizer/common copying JapaneseTokenizer/common/juman_utils.py -> build/lib/JapaneseTokenizer/common copying JapaneseTokenizer/common/__init__.py -> build/lib/JapaneseTokenizer/common creating build/lib/JapaneseTokenizer/mecab_wrapper copying JapaneseTokenizer/mecab_wrapper/mecab_wrapper.py -> build/lib/JapaneseTokenizer/mecab_wrapper copying JapaneseTokenizer/mecab_wrapper/__init__.py -> build/lib/JapaneseTokenizer/mecab_wrapper creating build/lib/JapaneseTokenizer/juman_wrapper copying JapaneseTokenizer/juman_wrapper/juman_wrapper.py -> build/lib/JapaneseTokenizer/juman_wrapper copying JapaneseTokenizer/juman_wrapper/__init__.py -> build/lib/JapaneseTokenizer/juman_wrapper creating build/lib/test/common copying test/common/test_server_handler.py -> build/lib/test/common copying test/common/__init__.py -> build/lib/test/common installing to build/bdist.linux-armv7l/wheel running install running install_lib creating build/bdist.linux-armv7l creating build/bdist.linux-armv7l/wheel creating build/bdist.linux-armv7l/wheel/JapaneseTokenizer copying 
build/lib/JapaneseTokenizer/object_models.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer creating build/bdist.linux-armv7l/wheel/JapaneseTokenizer/jumanpp_wrapper copying build/lib/JapaneseTokenizer/jumanpp_wrapper/jumanpp_wrapper.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/jumanpp_wrapper copying build/lib/JapaneseTokenizer/jumanpp_wrapper/__init__.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/jumanpp_wrapper copying build/lib/JapaneseTokenizer/datamodels.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer creating build/bdist.linux-armv7l/wheel/JapaneseTokenizer/kytea_wrapper copying build/lib/JapaneseTokenizer/kytea_wrapper/kytea_wrapper.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/kytea_wrapper copying build/lib/JapaneseTokenizer/kytea_wrapper/__init__.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/kytea_wrapper copying build/lib/JapaneseTokenizer/init_logger.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer copying build/lib/JapaneseTokenizer/__init__.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer creating build/bdist.linux-armv7l/wheel/JapaneseTokenizer/common copying build/lib/JapaneseTokenizer/common/sever_handler.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/common copying build/lib/JapaneseTokenizer/common/text_preprocess.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/common copying build/lib/JapaneseTokenizer/common/timeout_handler.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/common copying build/lib/JapaneseTokenizer/common/juman_utils.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/common copying build/lib/JapaneseTokenizer/common/__init__.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/common creating build/bdist.linux-armv7l/wheel/JapaneseTokenizer/mecab_wrapper copying build/lib/JapaneseTokenizer/mecab_wrapper/mecab_wrapper.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/mecab_wrapper copying build/lib/JapaneseTokenizer/mecab_wrapper/__init__.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/mecab_wrapper creating build/bdist.linux-armv7l/wheel/JapaneseTokenizer/juman_wrapper copying build/lib/JapaneseTokenizer/juman_wrapper/juman_wrapper.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/juman_wrapper copying build/lib/JapaneseTokenizer/juman_wrapper/__init__.py -> build/bdist.linux-armv7l/wheel/JapaneseTokenizer/juman_wrapper creating build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_juman_wrapper_python2.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_mecab_wrapper_python2.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_filter_python2.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_filter_python3.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_jumanpp_wrapper_python2.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_mecab_wrapper_python3.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_kytea_wrapper_python2.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_all.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_jumanpp_wrapper_python3.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/__init__.py -> build/bdist.linux-armv7l/wheel/test creating build/bdist.linux-armv7l/wheel/test/common copying build/lib/test/common/test_server_handler.py -> build/bdist.linux-armv7l/wheel/test/common copying build/lib/test/common/__init__.py -> 
build/bdist.linux-armv7l/wheel/test/common copying build/lib/test/test_kytea_wrapper_python3.py -> build/bdist.linux-armv7l/wheel/test copying build/lib/test/test_juman_wrapper_python3.py -> build/bdist.linux-armv7l/wheel/test running install_egg_info running egg_info writing JapaneseTokenizer.egg-info/PKG-INFO writing requirements to JapaneseTokenizer.egg-info/requires.txt writing dependency_links to JapaneseTokenizer.egg-info/dependency_links.txt writing top-level names to JapaneseTokenizer.egg-info/top_level.txt reading manifest file 'JapaneseTokenizer.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no files found matching 'README_JP.md' warning: no files found matching 'examples' warning: no files found matching 'test' writing manifest file 'JapaneseTokenizer.egg-info/SOURCES.txt' Copying JapaneseTokenizer.egg-info to build/bdist.linux-armv7l/wheel/JapaneseTokenizer-1.3.5-py3.4.egg-info running install_scripts creating build/bdist.linux-armv7l/wheel/JapaneseTokenizer-1.3.5.dist-info/WHEEL creating '/tmp/tmp5q88o_iipip-wheel-/JapaneseTokenizer-1.3.5-py3-none-any.whl' and adding '.' to it adding 'JapaneseTokenizer/object_models.py' adding 'JapaneseTokenizer/datamodels.py' adding 'JapaneseTokenizer/init_logger.py' adding 'JapaneseTokenizer/__init__.py' adding 'JapaneseTokenizer/jumanpp_wrapper/jumanpp_wrapper.py' adding 'JapaneseTokenizer/jumanpp_wrapper/__init__.py' adding 'JapaneseTokenizer/kytea_wrapper/kytea_wrapper.py' adding 'JapaneseTokenizer/kytea_wrapper/__init__.py' adding 'JapaneseTokenizer/common/sever_handler.py' adding 'JapaneseTokenizer/common/text_preprocess.py' adding 'JapaneseTokenizer/common/timeout_handler.py' adding 'JapaneseTokenizer/common/juman_utils.py' adding 'JapaneseTokenizer/common/__init__.py' adding 'JapaneseTokenizer/mecab_wrapper/mecab_wrapper.py' adding 'JapaneseTokenizer/mecab_wrapper/__init__.py' adding 'JapaneseTokenizer/juman_wrapper/juman_wrapper.py' adding 'JapaneseTokenizer/juman_wrapper/__init__.py' adding 'test/test_juman_wrapper_python2.py' adding 'test/test_mecab_wrapper_python2.py' adding 'test/test_filter_python2.py' adding 'test/test_filter_python3.py' adding 'test/test_jumanpp_wrapper_python2.py' adding 'test/test_mecab_wrapper_python3.py' adding 'test/test_kytea_wrapper_python2.py' adding 'test/test_all.py' adding 'test/test_jumanpp_wrapper_python3.py' adding 'test/__init__.py' adding 'test/test_kytea_wrapper_python3.py' adding 'test/test_juman_wrapper_python3.py' adding 'test/common/test_server_handler.py' adding 'test/common/__init__.py' adding 'JapaneseTokenizer-1.3.5.dist-info/DESCRIPTION.rst' adding 'JapaneseTokenizer-1.3.5.dist-info/metadata.json' adding 'JapaneseTokenizer-1.3.5.dist-info/top_level.txt' adding 'JapaneseTokenizer-1.3.5.dist-info/WHEEL' adding 'JapaneseTokenizer-1.3.5.dist-info/METADATA' adding 'JapaneseTokenizer-1.3.5.dist-info/RECORD' Running setup.py bdist_wheel for JapaneseTokenizer: finished with status 'done' Stored in directory: /tmp/tmp_9qx_kp2 Successfully built JapaneseTokenizer Cleaning up... Removing source in /tmp/pip-build-md0x29rt/JapaneseTokenizer
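The JapaneseTokenizer wheel itself builds, even though kytea and pyknp could not be installed automatically. A rough sanity check once pip finishes installing the wheel, assuming pip3 and python3 refer to the same interpreter as /usr/bin/python3 above (if the import fails, a missing backend such as KyTea, MeCab, or JUMAN/KNP is the usual cause):

pip3 show JapaneseTokenizer                          # should report Version: 1.3.5
python3 -c "import JapaneseTokenizer; print('ok')"   # basic import check

The kytea_wrapper and juman/jumanpp wrappers still need their native tokenizers installed separately before they can be used.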