Created temporary directory: /tmp/pip-ephem-wheel-cache-w6qb7nr2
Created temporary directory: /tmp/pip-req-tracker-f1g50939
Created requirements tracker '/tmp/pip-req-tracker-f1g50939'
Created temporary directory: /tmp/pip-wheel-i5iw94zs
Collecting scrapy-proxy==0.2
  1 location(s) to search for versions of scrapy-proxy:
  * https://pypi.org/simple/scrapy-proxy/
  Getting page https://pypi.org/simple/scrapy-proxy/
  Analyzing links from page https://pypi.org/simple/scrapy-proxy/
    Found link https://files.pythonhosted.org/packages/8b/68/1227752a9588357addceb3d4325a6f9f4d50890d72cdacd75771adeee4da/scrapy-proxy-0.2.tar.gz#sha256=701bef5614c36f6e31b966a38fe541318e5769cbc6535e5f9365b5283b64a5a4 (from https://pypi.org/simple/scrapy-proxy/), version: 0.2
  Using version 0.2 (newest of versions: 0.2)
  Created temporary directory: /tmp/pip-unpack-mfcrxelx
  Downloading https://files.pythonhosted.org/packages/8b/68/1227752a9588357addceb3d4325a6f9f4d50890d72cdacd75771adeee4da/scrapy-proxy-0.2.tar.gz
  Downloading from URL https://files.pythonhosted.org/packages/8b/68/1227752a9588357addceb3d4325a6f9f4d50890d72cdacd75771adeee4da/scrapy-proxy-0.2.tar.gz#sha256=701bef5614c36f6e31b966a38fe541318e5769cbc6535e5f9365b5283b64a5a4 (from https://pypi.org/simple/scrapy-proxy/)
  Added scrapy-proxy==0.2 from https://files.pythonhosted.org/packages/8b/68/1227752a9588357addceb3d4325a6f9f4d50890d72cdacd75771adeee4da/scrapy-proxy-0.2.tar.gz#sha256=701bef5614c36f6e31b966a38fe541318e5769cbc6535e5f9365b5283b64a5a4 to build tracker '/tmp/pip-req-tracker-f1g50939'
  Running setup.py (path:/tmp/pip-wheel-i5iw94zs/scrapy-proxy/setup.py) egg_info for package scrapy-proxy
  Running command python setup.py egg_info
  running egg_info
  creating pip-egg-info/scrapy_proxy.egg-info
  writing dependency_links to pip-egg-info/scrapy_proxy.egg-info/dependency_links.txt
  writing top-level names to pip-egg-info/scrapy_proxy.egg-info/top_level.txt
  writing requirements to pip-egg-info/scrapy_proxy.egg-info/requires.txt
  writing pip-egg-info/scrapy_proxy.egg-info/PKG-INFO
  writing manifest file 'pip-egg-info/scrapy_proxy.egg-info/SOURCES.txt'
  reading manifest file 'pip-egg-info/scrapy_proxy.egg-info/SOURCES.txt'
  writing manifest file 'pip-egg-info/scrapy_proxy.egg-info/SOURCES.txt'
  Source in /tmp/pip-wheel-i5iw94zs/scrapy-proxy has version 0.2, which satisfies requirement scrapy-proxy==0.2 from https://files.pythonhosted.org/packages/8b/68/1227752a9588357addceb3d4325a6f9f4d50890d72cdacd75771adeee4da/scrapy-proxy-0.2.tar.gz#sha256=701bef5614c36f6e31b966a38fe541318e5769cbc6535e5f9365b5283b64a5a4
  Removed scrapy-proxy==0.2 from https://files.pythonhosted.org/packages/8b/68/1227752a9588357addceb3d4325a6f9f4d50890d72cdacd75771adeee4da/scrapy-proxy-0.2.tar.gz#sha256=701bef5614c36f6e31b966a38fe541318e5769cbc6535e5f9365b5283b64a5a4 from build tracker '/tmp/pip-req-tracker-f1g50939'
Building wheels for collected packages: scrapy-proxy
  Created temporary directory: /tmp/pip-wheel-ampe2t_o
  Running setup.py bdist_wheel for scrapy-proxy: started
  Destination directory: /tmp/pip-wheel-ampe2t_o
  Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-wheel-i5iw94zs/scrapy-proxy/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-ampe2t_o
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/scrapy_proxy
  copying scrapy_proxy/middleware.py -> build/lib/scrapy_proxy
  copying scrapy_proxy/__init__.py -> build/lib/scrapy_proxy
  installing to build/bdist.linux-armv7l/wheel
  running install
  running install_lib
  creating build/bdist.linux-armv7l
  creating build/bdist.linux-armv7l/wheel
  creating build/bdist.linux-armv7l/wheel/scrapy_proxy
  copying build/lib/scrapy_proxy/middleware.py -> build/bdist.linux-armv7l/wheel/scrapy_proxy
  copying build/lib/scrapy_proxy/__init__.py -> build/bdist.linux-armv7l/wheel/scrapy_proxy
  running install_egg_info
  running egg_info
  creating scrapy_proxy.egg-info
  writing dependency_links to scrapy_proxy.egg-info/dependency_links.txt
  writing scrapy_proxy.egg-info/PKG-INFO
  writing top-level names to scrapy_proxy.egg-info/top_level.txt
  writing requirements to scrapy_proxy.egg-info/requires.txt
  writing manifest file 'scrapy_proxy.egg-info/SOURCES.txt'
  reading manifest file 'scrapy_proxy.egg-info/SOURCES.txt'
  writing manifest file 'scrapy_proxy.egg-info/SOURCES.txt'
  Copying scrapy_proxy.egg-info to build/bdist.linux-armv7l/wheel/scrapy_proxy-0.2-py3.4.egg-info
  running install_scripts
  creating build/bdist.linux-armv7l/wheel/scrapy_proxy-0.2.dist-info/WHEEL
  creating '/tmp/pip-wheel-ampe2t_o/scrapy_proxy-0.2-py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it
  adding 'scrapy_proxy/__init__.py'
  adding 'scrapy_proxy/middleware.py'
  adding 'scrapy_proxy-0.2.dist-info/METADATA'
  adding 'scrapy_proxy-0.2.dist-info/WHEEL'
  adding 'scrapy_proxy-0.2.dist-info/top_level.txt'
  adding 'scrapy_proxy-0.2.dist-info/RECORD'
  removing build/bdist.linux-armv7l/wheel
  Running setup.py bdist_wheel for scrapy-proxy: finished with status 'done'
  Stored in directory: /tmp/tmp4t_y6m0q
Successfully built scrapy-proxy
Cleaning up...
  Removing source in /tmp/pip-wheel-i5iw94zs/scrapy-proxy
Removed build tracker '/tmp/pip-req-tracker-f1g50939'
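The `#sha256=` fragment on the download URL in the log above is the digest pip compares against after fetching the sdist. A minimal, offline sketch of the same check follows; the file name and contents here are hypothetical stand-ins, not the real scrapy-proxy-0.2.tar.gz:

```shell
# Hypothetical stand-in for a downloaded sdist; the real archive in the log
# is scrapy-proxy-0.2.tar.gz, pinned by the sha256 in the URL fragment.
printf 'demo sdist contents\n' > /tmp/demo-sdist.tar.gz

# Compute the digest the same way pip compares it to the #sha256= value:
# a mismatch would make pip abort the install.
sha256sum /tmp/demo-sdist.tar.gz | cut -d' ' -f1
```

In a requirements file the same pin is enforced with `pip install --require-hashes -r requirements.txt`, which refuses any package whose archive digest does not match.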