Collecting web-crawler-plus==0.9.11
  1 location(s) to search for versions of web-crawler-plus:
  * https://pypi.python.org/simple/web-crawler-plus/
  Getting page https://pypi.python.org/simple/web-crawler-plus/
  Analyzing links from page https://pypi.org/simple/web-crawler-plus/
    Found link https://files.pythonhosted.org/packages/6a/05/c3e032a2ab7f5fa66be7414fc32773ed6910fb9dde5f710a3e14fa37d709/web-crawler-plus-0.9.0.beta.tar.gz#sha256=a7d938850ea1aebfcfe6ebdfe5ebb6b4e6efcd4c73946be9266962df5471dcab (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.0.beta
    Found link https://files.pythonhosted.org/packages/bf/50/86a4d0febc65b00b37d27ccd2fa013eeae36b6376243a4e417df0b2e1473/web-crawler-plus-0.9.1a0.tar.gz#sha256=c494c72aace65d4ba17187d99793f1e4dad89fe11cae2f3740638c5e42d6e2db (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.1a0
    Found link https://files.pythonhosted.org/packages/8c/d6/d9cfcffa6232c12d7ff69918a934307f67c438247c620d4c1f2e6b603c1a/web-crawler-plus-0.9.1.beta.tar.gz#sha256=c5f5bab37ed680773154ccaeb8f697ab1c0fffc72df3bafcf9b4124a3cf1c450 (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.1.beta
    Found link https://files.pythonhosted.org/packages/3e/29/5aafea76c8b1465c4ced4c468d471ac9067912d8355fbe24b7f2a1d7b6ba/web-crawler-plus-0.9.1.beta1.tar.gz#sha256=10f73529863c6057ac697ba9d05d2cbbcaee590855270080b8de8988a89428ae (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.1.beta1
    Found link https://files.pythonhosted.org/packages/51/38/d8b8ad7bc1cbfc54824dc86b23ea7f70c681d225b2e52b176aa590ff313a/web-crawler-plus-0.9.2.tar.gz#sha256=631ad7c94e54a02ef87dc0466a502dc7b9f71885dff8e3252c0b7c0672de8259 (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.2
    Found link https://files.pythonhosted.org/packages/7b/aa/6e10c296413897658396415e28b28b71a57dc935e8a9c48d244b0fe38962/web-crawler-plus-0.9.3.beta.tar.gz#sha256=a6a579ce9c1bfdccc842a02daf7f6a737afb3b083b658ad9385d0c4a87f85b71 (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.3.beta
    Found link https://files.pythonhosted.org/packages/56/60/b38a8f0df0dc255d17c045f06dffd6d145443b3305c68d37716f1151b4f9/web-crawler-plus-0.9.4.beta.tar.gz#sha256=59643df37ea62513e69d446384d28425f0d2c0e7f38108a453fcf79ae260f426 (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.4.beta
    Found link https://files.pythonhosted.org/packages/3d/e9/52577a98eddc43f3106c324032686f41b49b183386cfa0512a0294954a68/web-crawler-plus-0.9.5.beta.tar.gz#sha256=d5f0bbc4b57c1186e6e03324b41f3a2db57a0ef3b2f4d00884323a3d64fc23ad (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.5.beta
    Found link https://files.pythonhosted.org/packages/ed/4e/dff7fde8699d3983fd336697ff7c1aba80ea618ec8905e9d7a2992102ecf/web-crawler-plus-0.9.6.beta.tar.gz#sha256=2caa55f72230c79b70a6723c37b84f75bbbaa44200102d32ea56324b897bfddb (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.6.beta
    Found link https://files.pythonhosted.org/packages/c4/2c/eef68baed9a728ced084f4c031fcda21cba587c45d52ded26f4cdece29ac/web-crawler-plus-0.9.7a0.tar.gz#sha256=6320de0d1a625efd9565042a853c00d6a42563377ab0d3c44160af5cef0d1107 (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.7a0
    Found link https://files.pythonhosted.org/packages/3f/a3/6445df10056f44fe194bd8427e86601270a7b5a37edacadde260e51461d6/web-crawler-plus-0.9.8.tar.gz#sha256=3e7b5200d06deaba979b0bef64c0a408c5ce87942b1319fe6484869163d3c6ed (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.8
    Found link https://files.pythonhosted.org/packages/7a/e3/f84e7f0e4424075379bc20139ce70f528c6cbe0cc3b4f908a53fdbf16696/web-crawler-plus-0.9.9.tar.gz#sha256=dcb025bc59f6eef7dd38f484147b5a899f1cf6088387ee2229fab257ac755fcb (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.9
    Found link https://files.pythonhosted.org/packages/78/cc/01a85b3eb8b03f490c078b655b401164eca46edc9df997989c43ed71aa40/web-crawler-plus-0.9.10.tar.gz#sha256=0ac7ce25b8f17935a5977e0581ac893cb86479a85ade175fd792fa61f22593bf (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.10
    Found link https://files.pythonhosted.org/packages/04/cb/544262fecbceed7abee718225ddacb43b27da864cafe87d69d1fc7830435/web-crawler-plus-0.9.11.tar.gz#sha256=de2a0277f1df1cfd72f4a2ea19206dfb4979b35067f4774f101a9070bc911bcf (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.11
    Found link https://files.pythonhosted.org/packages/56/23/34545ebaff47e33a57f4637f6d110fad1d60425249f9076899c51a6bd363/web-crawler-plus-0.9.12.tar.gz#sha256=7719e5fb8bafd3578289df268aaa04e58a642c93d7e68af68204ed745e1c65e0 (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.12
    Found link https://files.pythonhosted.org/packages/4b/0d/c2c489463716958fd56bac20c2f2345d9a467fb7baa65ca5d252cd0aa2c9/web-crawler-plus-0.9.13.tar.gz#sha256=a12b87a27b5ae32a81d54e8040ff37762685d6c35ea582ab0c26346ec2945efe (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.13
    Found link https://files.pythonhosted.org/packages/c2/3c/5ca17d3a593f64572c5e286de0597a550cfce3bbe332ea5d603e8faf5705/web-crawler-plus-0.9.14.tar.gz#sha256=a09b346dc28775d6682917ee788630d146945a00a9c5f2e5eb6f9b7be1cd1e53 (from https://pypi.org/simple/web-crawler-plus/), version: 0.9.14
  Using version 0.9.11 (newest of versions: 0.9.11)
  Downloading https://files.pythonhosted.org/packages/04/cb/544262fecbceed7abee718225ddacb43b27da864cafe87d69d1fc7830435/web-crawler-plus-0.9.11.tar.gz
  Downloading from URL https://files.pythonhosted.org/packages/04/cb/544262fecbceed7abee718225ddacb43b27da864cafe87d69d1fc7830435/web-crawler-plus-0.9.11.tar.gz#sha256=de2a0277f1df1cfd72f4a2ea19206dfb4979b35067f4774f101a9070bc911bcf (from https://pypi.org/simple/web-crawler-plus/)
  Running setup.py (path:/tmp/pip-build-8v70zf64/web-crawler-plus/setup.py) egg_info for package web-crawler-plus
    Running command python setup.py egg_info
    running egg_info
    creating pip-egg-info/web_crawler_plus.egg-info
    writing dependency_links to pip-egg-info/web_crawler_plus.egg-info/dependency_links.txt
    writing requirements to pip-egg-info/web_crawler_plus.egg-info/requires.txt
    writing pip-egg-info/web_crawler_plus.egg-info/PKG-INFO
    writing top-level names to pip-egg-info/web_crawler_plus.egg-info/top_level.txt
    writing manifest file 'pip-egg-info/web_crawler_plus.egg-info/SOURCES.txt'
    reading manifest file 'pip-egg-info/web_crawler_plus.egg-info/SOURCES.txt'
    writing manifest file 'pip-egg-info/web_crawler_plus.egg-info/SOURCES.txt'
  Source in /tmp/pip-build-8v70zf64/web-crawler-plus has version 0.9.11, which satisfies requirement web-crawler-plus==0.9.11 from https://files.pythonhosted.org/packages/04/cb/544262fecbceed7abee718225ddacb43b27da864cafe87d69d1fc7830435/web-crawler-plus-0.9.11.tar.gz#sha256=de2a0277f1df1cfd72f4a2ea19206dfb4979b35067f4774f101a9070bc911bcf
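
Although the index page lists seventeen releases, the line "Using version 0.9.11 (newest of versions: 0.9.11)" names only one candidate: the exact pin ==0.9.11 filters out every other release, including the .beta and a0 pre-releases, before the newest match is chosen. (Verbose output of this shape typically comes from an invocation along the lines of pip install -v web-crawler-plus==0.9.11; the command itself is not part of the log, so treat it as an assumption.) A minimal sketch of that filtering rule, using the third-party packaging library rather than pip's internal code:

from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Version strings taken from the "Found link ..., version: X" lines above.
found = [
    "0.9.0.beta", "0.9.1a0", "0.9.1.beta", "0.9.1.beta1", "0.9.2",
    "0.9.3.beta", "0.9.4.beta", "0.9.5.beta", "0.9.6.beta", "0.9.7a0",
    "0.9.8", "0.9.9", "0.9.10", "0.9.11", "0.9.12", "0.9.13", "0.9.14",
]

# An exact pin leaves a single candidate, so "newest" is trivially 0.9.11.
pin = SpecifierSet("==0.9.11")
candidates = sorted(Version(v) for v in pin.filter(found))

print(candidates)       # [<Version('0.9.11')>]
print(max(candidates))  # 0.9.11

With no pin at all (an empty SpecifierSet), the same filter would drop the pre-releases by default and report 0.9.14 as the newest release.
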
Building wheels for collected packages: web-crawler-plus
  Running setup.py bdist_wheel for web-crawler-plus: started
  Destination directory: /tmp/tmp5e6r551mpip-wheel-
  Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-8v70zf64/web-crawler-plus/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmp5e6r551mpip-wheel-
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/webcrawler
  copying webcrawler/exceptions.py -> build/lib/webcrawler
  copying webcrawler/settings.py -> build/lib/webcrawler
  copying webcrawler/__init__.py -> build/lib/webcrawler
  copying webcrawler/parser.py -> build/lib/webcrawler
  creating build/lib/tests
  copying tests/__init__.py -> build/lib/tests
  creating build/lib/webcrawler/spiders
  copying webcrawler/spiders/generic.py -> build/lib/webcrawler/spiders
  copying webcrawler/spiders/website.py -> build/lib/webcrawler/spiders
  copying webcrawler/spiders/__init__.py -> build/lib/webcrawler/spiders
  copying webcrawler/spiders/feeds.py -> build/lib/webcrawler/spiders
  creating build/lib/webcrawler/pipelines
  copying webcrawler/pipelines/mongodb.py -> build/lib/webcrawler/pipelines
  copying webcrawler/pipelines/elasticsearch.py -> build/lib/webcrawler/pipelines
  copying webcrawler/pipelines/__init__.py -> build/lib/webcrawler/pipelines
  copying webcrawler/pipelines/solr.py -> build/lib/webcrawler/pipelines
  creating build/lib/webcrawler/httpcache
  copying webcrawler/httpcache/mongodb.py -> build/lib/webcrawler/httpcache
  copying webcrawler/httpcache/elasticsearch.py -> build/lib/webcrawler/httpcache
  copying webcrawler/httpcache/__init__.py -> build/lib/webcrawler/httpcache
  copying webcrawler/httpcache/solr.py -> build/lib/webcrawler/httpcache
  creating build/lib/webcrawler/spiders/search_engines
  copying webcrawler/spiders/search_engines/base.py -> build/lib/webcrawler/spiders/search_engines
  copying webcrawler/spiders/search_engines/bing.py -> build/lib/webcrawler/spiders/search_engines
  copying webcrawler/spiders/search_engines/__init__.py -> build/lib/webcrawler/spiders/search_engines
  creating build/lib/webcrawler/utils
  copying webcrawler/utils/selectors.py -> build/lib/webcrawler/utils
  copying webcrawler/utils/config.py -> build/lib/webcrawler/utils
  copying webcrawler/utils/url.py -> build/lib/webcrawler/utils
  copying webcrawler/utils/__init__.py -> build/lib/webcrawler/utils
  installing to build/bdist.linux-armv7l/wheel
  running install
  running install_lib
  creating build/bdist.linux-armv7l
  creating build/bdist.linux-armv7l/wheel
  creating build/bdist.linux-armv7l/wheel/webcrawler
  copying build/lib/webcrawler/exceptions.py -> build/bdist.linux-armv7l/wheel/webcrawler
  creating build/bdist.linux-armv7l/wheel/webcrawler/spiders
  creating build/bdist.linux-armv7l/wheel/webcrawler/spiders/search_engines
  copying build/lib/webcrawler/spiders/search_engines/base.py -> build/bdist.linux-armv7l/wheel/webcrawler/spiders/search_engines
  copying build/lib/webcrawler/spiders/search_engines/bing.py -> build/bdist.linux-armv7l/wheel/webcrawler/spiders/search_engines
  copying build/lib/webcrawler/spiders/search_engines/__init__.py -> build/bdist.linux-armv7l/wheel/webcrawler/spiders/search_engines
  copying build/lib/webcrawler/spiders/generic.py -> build/bdist.linux-armv7l/wheel/webcrawler/spiders
  copying build/lib/webcrawler/spiders/website.py -> build/bdist.linux-armv7l/wheel/webcrawler/spiders
  copying build/lib/webcrawler/spiders/__init__.py -> build/bdist.linux-armv7l/wheel/webcrawler/spiders
  copying build/lib/webcrawler/spiders/feeds.py -> build/bdist.linux-armv7l/wheel/webcrawler/spiders
  copying build/lib/webcrawler/settings.py -> build/bdist.linux-armv7l/wheel/webcrawler
  creating build/bdist.linux-armv7l/wheel/webcrawler/pipelines
  copying build/lib/webcrawler/pipelines/mongodb.py -> build/bdist.linux-armv7l/wheel/webcrawler/pipelines
  copying build/lib/webcrawler/pipelines/elasticsearch.py -> build/bdist.linux-armv7l/wheel/webcrawler/pipelines
  copying build/lib/webcrawler/pipelines/__init__.py -> build/bdist.linux-armv7l/wheel/webcrawler/pipelines
  copying build/lib/webcrawler/pipelines/solr.py -> build/bdist.linux-armv7l/wheel/webcrawler/pipelines
  creating build/bdist.linux-armv7l/wheel/webcrawler/httpcache
  copying build/lib/webcrawler/httpcache/mongodb.py -> build/bdist.linux-armv7l/wheel/webcrawler/httpcache
  copying build/lib/webcrawler/httpcache/elasticsearch.py -> build/bdist.linux-armv7l/wheel/webcrawler/httpcache
  copying build/lib/webcrawler/httpcache/__init__.py -> build/bdist.linux-armv7l/wheel/webcrawler/httpcache
  copying build/lib/webcrawler/httpcache/solr.py -> build/bdist.linux-armv7l/wheel/webcrawler/httpcache
  creating build/bdist.linux-armv7l/wheel/webcrawler/utils
  copying build/lib/webcrawler/utils/selectors.py -> build/bdist.linux-armv7l/wheel/webcrawler/utils
  copying build/lib/webcrawler/utils/config.py -> build/bdist.linux-armv7l/wheel/webcrawler/utils
  copying build/lib/webcrawler/utils/url.py -> build/bdist.linux-armv7l/wheel/webcrawler/utils
  copying build/lib/webcrawler/utils/__init__.py -> build/bdist.linux-armv7l/wheel/webcrawler/utils
  copying build/lib/webcrawler/__init__.py -> build/bdist.linux-armv7l/wheel/webcrawler
  copying build/lib/webcrawler/parser.py -> build/bdist.linux-armv7l/wheel/webcrawler
  creating build/bdist.linux-armv7l/wheel/tests
  copying build/lib/tests/__init__.py -> build/bdist.linux-armv7l/wheel/tests
  running install_egg_info
  running egg_info
  writing dependency_links to web_crawler_plus.egg-info/dependency_links.txt
  writing requirements to web_crawler_plus.egg-info/requires.txt
  writing web_crawler_plus.egg-info/PKG-INFO
  writing top-level names to web_crawler_plus.egg-info/top_level.txt
  reading manifest file 'web_crawler_plus.egg-info/SOURCES.txt'
  writing manifest file 'web_crawler_plus.egg-info/SOURCES.txt'
  Copying web_crawler_plus.egg-info to build/bdist.linux-armv7l/wheel/web_crawler_plus-0.9.11-py3.4.egg-info
  running install_scripts
  creating build/bdist.linux-armv7l/wheel/web_crawler_plus-0.9.11.dist-info/WHEEL
  Running setup.py bdist_wheel for web-crawler-plus: finished with status 'done'
  Stored in directory: /tmp/tmpgb9obxlt
Successfully built web-crawler-plus
Cleaning up...
  Removing source in /tmp/pip-build-8v70zf64/web-crawler-plus
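
Two further details are worth noting. Every download URL carries a #sha256=... fragment, which pip compares against the digest of the archive it actually fetched. And because only .tar.gz sdists are published for this project (every Found link ends in .tar.gz), pip builds the wheel locally; the build/bdist.linux-armv7l paths and the web_crawler_plus-0.9.11-py3.4.egg-info name suggest a 32-bit ARM host running Python 3.4. The hash check is easy to reproduce by hand; a minimal sketch with hashlib, where the local filename is hypothetical and the expected digest is copied from the 0.9.11 link above:

import hashlib

EXPECTED = "de2a0277f1df1cfd72f4a2ea19206dfb4979b35067f4774f101a9070bc911bcf"

def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in chunks so large archives need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

print(sha256_of("web-crawler-plus-0.9.11.tar.gz") == EXPECTED)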