Created temporary directory: /tmp/pip-ephem-wheel-cache-mppqydyb
Created temporary directory: /tmp/pip-req-tracker-j0_1xgcq
Created requirements tracker '/tmp/pip-req-tracker-j0_1xgcq'
Created temporary directory: /tmp/pip-wheel-kyg0c__q
Collecting scrapio==0.1.4
1 location(s) to search for versions of scrapio:
* https://pypi.org/simple/scrapio/
Getting page https://pypi.org/simple/scrapio/
Analyzing links from page https://pypi.org/simple/scrapio/
Found link https://files.pythonhosted.org/packages/c0/cc/183d7d909626a9673e839a53b737dd24f7ab4311d27e8a4d7c4869416278/scrapio-0.1.0.tar.gz#sha256=478d5218508e6ac2446ef655852b164952c51c183de352427ab24a276dfbd99e (from https://pypi.org/simple/scrapio/) (requires-python:>=3.5.3), version: 0.1.0
Found link https://files.pythonhosted.org/packages/f1/fb/5cd52b0512638be3a83d5265cdb36d2e06c38fefcf3b3b523efe02ff7c23/scrapio-0.1.1.tar.gz#sha256=0fc3bd3062b31462b96c51e94c22e4c787586fbbdd51690b28e0057d83174ef4 (from https://pypi.org/simple/scrapio/) (requires-python:>=3.5.3), version: 0.1.1
Found link https://files.pythonhosted.org/packages/c6/c5/b3d9ab80f9381cd53c9877dc7cf128ab2631bb0c5ada7301d671fa70ba4e/scrapio-0.1.2.tar.gz#sha256=b85b1a8cb6602aa0be4458ad4cfb9b5c277661210246ecca992067d1c633b825 (from https://pypi.org/simple/scrapio/) (requires-python:>=3.5.3), version: 0.1.2
Found link https://files.pythonhosted.org/packages/6d/ce/0d84c9fb7d98b83f2052682106e69db96d2a62dec74f5420430ed2ed163e/scrapio-0.1.3.tar.gz#sha256=f9d1bcecdc3e2b63f6fb6264d8826cd809b4d538919b6f974002305addd597c2 (from https://pypi.org/simple/scrapio/) (requires-python:>=3.5.3), version: 0.1.3
Found link https://files.pythonhosted.org/packages/10/d3/c211b38edc03c03f52941b11cc00af21819ae6c09795c9e8c781d327e4ff/scrapio-0.1.4.tar.gz#sha256=07157bd42e16eb54d31317dc37a6f6e34bc24dac1753dae6e3ba5b73f3655cf0 (from https://pypi.org/simple/scrapio/) (requires-python:>=3.5.3), version: 0.1.4
Using version 0.1.4 (newest of versions: 0.1.4)
Created temporary directory: /tmp/pip-unpack-hqmt62vx
Downloading https://files.pythonhosted.org/packages/10/d3/c211b38edc03c03f52941b11cc00af21819ae6c09795c9e8c781d327e4ff/scrapio-0.1.4.tar.gz
Downloading from URL https://files.pythonhosted.org/packages/10/d3/c211b38edc03c03f52941b11cc00af21819ae6c09795c9e8c781d327e4ff/scrapio-0.1.4.tar.gz#sha256=07157bd42e16eb54d31317dc37a6f6e34bc24dac1753dae6e3ba5b73f3655cf0 (from https://pypi.org/simple/scrapio/) (requires-python:>=3.5.3)
Added scrapio==0.1.4 from https://files.pythonhosted.org/packages/10/d3/c211b38edc03c03f52941b11cc00af21819ae6c09795c9e8c781d327e4ff/scrapio-0.1.4.tar.gz#sha256=07157bd42e16eb54d31317dc37a6f6e34bc24dac1753dae6e3ba5b73f3655cf0 to build tracker '/tmp/pip-req-tracker-j0_1xgcq'
Running setup.py (path:/tmp/pip-wheel-kyg0c__q/scrapio/setup.py) egg_info for package scrapio
Running command python setup.py egg_info
running egg_info
creating pip-egg-info/scrapio.egg-info
writing top-level names to pip-egg-info/scrapio.egg-info/top_level.txt
writing requirements to pip-egg-info/scrapio.egg-info/requires.txt
writing pip-egg-info/scrapio.egg-info/PKG-INFO
writing dependency_links to pip-egg-info/scrapio.egg-info/dependency_links.txt
writing manifest file 'pip-egg-info/scrapio.egg-info/SOURCES.txt'
reading manifest file 'pip-egg-info/scrapio.egg-info/SOURCES.txt'
writing manifest file 'pip-egg-info/scrapio.egg-info/SOURCES.txt'
Source in /tmp/pip-wheel-kyg0c__q/scrapio has version 0.1.4, which satisfies requirement scrapio==0.1.4 from https://files.pythonhosted.org/packages/10/d3/c211b38edc03c03f52941b11cc00af21819ae6c09795c9e8c781d327e4ff/scrapio-0.1.4.tar.gz#sha256=07157bd42e16eb54d31317dc37a6f6e34bc24dac1753dae6e3ba5b73f3655cf0
Removed scrapio==0.1.4 from https://files.pythonhosted.org/packages/10/d3/c211b38edc03c03f52941b11cc00af21819ae6c09795c9e8c781d327e4ff/scrapio-0.1.4.tar.gz#sha256=07157bd42e16eb54d31317dc37a6f6e34bc24dac1753dae6e3ba5b73f3655cf0 from build tracker '/tmp/pip-req-tracker-j0_1xgcq'
Building wheels for collected packages: scrapio
Created temporary directory: /tmp/pip-wheel-2527s94d
Running setup.py bdist_wheel for scrapio: started
Destination directory: /tmp/pip-wheel-2527s94d
Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-wheel-kyg0c__q/scrapio/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-2527s94d
running bdist_wheel
running build
running build_py
creating build
creating build/lib
creating build/lib/scrapio
copying scrapio/__init__.py -> build/lib/scrapio
creating build/lib/scrapio/requests
copying scrapio/requests/__init__.py -> build/lib/scrapio/requests
copying scrapio/requests/get.py -> build/lib/scrapio/requests
creating build/lib/scrapio/scrapers
copying scrapio/scrapers/splash_crawler.py -> build/lib/scrapio/scrapers
copying scrapio/scrapers/__init__.py -> build/lib/scrapio/scrapers
copying scrapio/scrapers/base_crawler.py -> build/lib/scrapio/scrapers
creating build/lib/scrapio/structures
copying scrapio/structures/filtering.py -> build/lib/scrapio/structures
copying scrapio/structures/proxies.py -> build/lib/scrapio/structures
copying scrapio/structures/__init__.py -> build/lib/scrapio/structures
copying scrapio/structures/queues.py -> build/lib/scrapio/structures
creating build/lib/scrapio/mixins
copying scrapio/mixins/__init__.py -> build/lib/scrapio/mixins
copying scrapio/mixins/mongo.py -> build/lib/scrapio/mixins
creating build/lib/scrapio/parsing
copying scrapio/parsing/valid_url.py -> build/lib/scrapio/parsing
copying scrapio/parsing/links.py -> build/lib/scrapio/parsing
copying scrapio/parsing/__init__.py -> build/lib/scrapio/parsing
creating build/lib/scrapio/utils
copying scrapio/utils/urls.py -> build/lib/scrapio/utils
copying scrapio/utils/__init__.py -> build/lib/scrapio/utils
copying scrapio/utils/helpers.py -> build/lib/scrapio/utils
creating build/lib/scrapio/scrapers/experimental
copying scrapio/scrapers/experimental/cfcrawl.py -> build/lib/scrapio/scrapers/experimental
copying scrapio/scrapers/experimental/__init__.py -> build/lib/scrapio/scrapers/experimental
running egg_info
writing requirements to scrapio.egg-info/requires.txt
writing dependency_links to scrapio.egg-info/dependency_links.txt
writing scrapio.egg-info/PKG-INFO
writing top-level names to scrapio.egg-info/top_level.txt
reading manifest file 'scrapio.egg-info/SOURCES.txt'
writing manifest file 'scrapio.egg-info/SOURCES.txt'
installing to build/bdist.linux-armv7l/wheel
running install
running install_lib
creating build/bdist.linux-armv7l
creating build/bdist.linux-armv7l/wheel
creating build/bdist.linux-armv7l/wheel/scrapio
creating build/bdist.linux-armv7l/wheel/scrapio/requests
copying build/lib/scrapio/requests/__init__.py -> build/bdist.linux-armv7l/wheel/scrapio/requests
copying build/lib/scrapio/requests/get.py -> build/bdist.linux-armv7l/wheel/scrapio/requests
creating build/bdist.linux-armv7l/wheel/scrapio/scrapers
creating build/bdist.linux-armv7l/wheel/scrapio/scrapers/experimental
copying build/lib/scrapio/scrapers/experimental/cfcrawl.py -> build/bdist.linux-armv7l/wheel/scrapio/scrapers/experimental
copying build/lib/scrapio/scrapers/experimental/__init__.py -> build/bdist.linux-armv7l/wheel/scrapio/scrapers/experimental
copying build/lib/scrapio/scrapers/splash_crawler.py -> build/bdist.linux-armv7l/wheel/scrapio/scrapers
copying build/lib/scrapio/scrapers/__init__.py -> build/bdist.linux-armv7l/wheel/scrapio/scrapers
copying build/lib/scrapio/scrapers/base_crawler.py -> build/bdist.linux-armv7l/wheel/scrapio/scrapers
creating build/bdist.linux-armv7l/wheel/scrapio/structures
copying build/lib/scrapio/structures/filtering.py -> build/bdist.linux-armv7l/wheel/scrapio/structures
copying build/lib/scrapio/structures/proxies.py -> build/bdist.linux-armv7l/wheel/scrapio/structures
copying build/lib/scrapio/structures/__init__.py -> build/bdist.linux-armv7l/wheel/scrapio/structures
copying build/lib/scrapio/structures/queues.py -> build/bdist.linux-armv7l/wheel/scrapio/structures
creating build/bdist.linux-armv7l/wheel/scrapio/mixins
copying build/lib/scrapio/mixins/__init__.py -> build/bdist.linux-armv7l/wheel/scrapio/mixins
copying build/lib/scrapio/mixins/mongo.py -> build/bdist.linux-armv7l/wheel/scrapio/mixins
copying build/lib/scrapio/__init__.py -> build/bdist.linux-armv7l/wheel/scrapio
creating build/bdist.linux-armv7l/wheel/scrapio/parsing
copying build/lib/scrapio/parsing/valid_url.py -> build/bdist.linux-armv7l/wheel/scrapio/parsing
copying build/lib/scrapio/parsing/links.py -> build/bdist.linux-armv7l/wheel/scrapio/parsing
copying build/lib/scrapio/parsing/__init__.py -> build/bdist.linux-armv7l/wheel/scrapio/parsing
creating build/bdist.linux-armv7l/wheel/scrapio/utils
copying build/lib/scrapio/utils/urls.py -> build/bdist.linux-armv7l/wheel/scrapio/utils
copying build/lib/scrapio/utils/__init__.py -> build/bdist.linux-armv7l/wheel/scrapio/utils
copying build/lib/scrapio/utils/helpers.py -> build/bdist.linux-armv7l/wheel/scrapio/utils
running install_egg_info
Copying scrapio.egg-info to build/bdist.linux-armv7l/wheel/scrapio-0.1.4-py3.5.egg-info
running install_scripts
creating build/bdist.linux-armv7l/wheel/scrapio-0.1.4.dist-info/WHEEL
creating '/tmp/pip-wheel-2527s94d/scrapio-0.1.4-py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it
adding 'scrapio/__init__.py'
adding 'scrapio/mixins/__init__.py'
adding 'scrapio/mixins/mongo.py'
adding 'scrapio/parsing/__init__.py'
adding 'scrapio/parsing/links.py'
adding 'scrapio/parsing/valid_url.py'
adding 'scrapio/requests/__init__.py'
adding 'scrapio/requests/get.py'
adding 'scrapio/scrapers/__init__.py'
adding 'scrapio/scrapers/base_crawler.py'
adding 'scrapio/scrapers/splash_crawler.py'
adding 'scrapio/scrapers/experimental/__init__.py'
adding 'scrapio/scrapers/experimental/cfcrawl.py'
adding 'scrapio/structures/__init__.py'
adding 'scrapio/structures/filtering.py'
adding 'scrapio/structures/proxies.py'
adding 'scrapio/structures/queues.py'
adding 'scrapio/utils/__init__.py'
adding 'scrapio/utils/helpers.py'
adding 'scrapio/utils/urls.py'
adding 'scrapio-0.1.4.dist-info/METADATA'
adding 'scrapio-0.1.4.dist-info/WHEEL'
adding 'scrapio-0.1.4.dist-info/top_level.txt'
adding 'scrapio-0.1.4.dist-info/RECORD'
removing build/bdist.linux-armv7l/wheel
Running setup.py bdist_wheel for scrapio: finished with status 'done'
Stored in directory: /tmp/tmpcvraluk_
Successfully built scrapio
Cleaning up...
Removing source in /tmp/pip-wheel-kyg0c__q/scrapio
Removed build tracker '/tmp/pip-req-tracker-j0_1xgcq'
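Note: a verbose pip trace like the one above is the kind of output pip emits when run with -v; a plausible invocation for this run would be something like python3 -m pip wheel -v scrapio==0.1.4 (or python3 -m pip install -v scrapio==0.1.4), though the exact command line is not part of the captured output and is only an assumption. What the log records is pip resolving scrapio 0.1.4 from PyPI, downloading the sdist, running setup.py egg_info and bdist_wheel, and producing the pure-Python wheel scrapio-0.1.4-py3-none-any.whl on an armv7l host running Python 3.5.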