Created temporary directory: /tmp/pip-ephem-wheel-cache-y9nd14vz
Created temporary directory: /tmp/pip-req-tracker-2r6ee1_c
Created requirements tracker '/tmp/pip-req-tracker-2r6ee1_c'
Created temporary directory: /tmp/pip-wheel-64huivt0
Collecting scrapy-proxy==0.3
  1 location(s) to search for versions of scrapy-proxy:
  * https://pypi.org/simple/scrapy-proxy/
  Getting page https://pypi.org/simple/scrapy-proxy/
  Analyzing links from page https://pypi.org/simple/scrapy-proxy/
    Found link https://files.pythonhosted.org/packages/8b/68/1227752a9588357addceb3d4325a6f9f4d50890d72cdacd75771adeee4da/scrapy-proxy-0.2.tar.gz#sha256=701bef5614c36f6e31b966a38fe541318e5769cbc6535e5f9365b5283b64a5a4 (from https://pypi.org/simple/scrapy-proxy/), version: 0.2
    Found link https://files.pythonhosted.org/packages/40/02/4f488f2acc81e506882a7ad6316b6cd20f621ee6be86093fb18c0f084daf/scrapy-proxy-0.3.tar.gz#sha256=a924f75cf30bc6e0de43df1bd0d14538e46dcc7f700bfe7cbe8399785dcfcdad (from https://pypi.org/simple/scrapy-proxy/), version: 0.3
  Using version 0.3 (newest of versions: 0.3)
  Created temporary directory: /tmp/pip-unpack-fzv1uzoz
  Downloading https://files.pythonhosted.org/packages/40/02/4f488f2acc81e506882a7ad6316b6cd20f621ee6be86093fb18c0f084daf/scrapy-proxy-0.3.tar.gz
  Downloading from URL https://files.pythonhosted.org/packages/40/02/4f488f2acc81e506882a7ad6316b6cd20f621ee6be86093fb18c0f084daf/scrapy-proxy-0.3.tar.gz#sha256=a924f75cf30bc6e0de43df1bd0d14538e46dcc7f700bfe7cbe8399785dcfcdad (from https://pypi.org/simple/scrapy-proxy/)
  Added scrapy-proxy==0.3 from https://files.pythonhosted.org/packages/40/02/4f488f2acc81e506882a7ad6316b6cd20f621ee6be86093fb18c0f084daf/scrapy-proxy-0.3.tar.gz#sha256=a924f75cf30bc6e0de43df1bd0d14538e46dcc7f700bfe7cbe8399785dcfcdad to build tracker '/tmp/pip-req-tracker-2r6ee1_c'
  Running setup.py (path:/tmp/pip-wheel-64huivt0/scrapy-proxy/setup.py) egg_info for package scrapy-proxy
  Running command python setup.py egg_info
  running egg_info
  creating pip-egg-info/scrapy_proxy.egg-info
  writing pip-egg-info/scrapy_proxy.egg-info/PKG-INFO
  writing requirements to pip-egg-info/scrapy_proxy.egg-info/requires.txt
  writing top-level names to pip-egg-info/scrapy_proxy.egg-info/top_level.txt
  writing dependency_links to pip-egg-info/scrapy_proxy.egg-info/dependency_links.txt
  writing manifest file 'pip-egg-info/scrapy_proxy.egg-info/SOURCES.txt'
  reading manifest file 'pip-egg-info/scrapy_proxy.egg-info/SOURCES.txt'
  writing manifest file 'pip-egg-info/scrapy_proxy.egg-info/SOURCES.txt'
  Source in /tmp/pip-wheel-64huivt0/scrapy-proxy has version 0.3, which satisfies requirement scrapy-proxy==0.3 from https://files.pythonhosted.org/packages/40/02/4f488f2acc81e506882a7ad6316b6cd20f621ee6be86093fb18c0f084daf/scrapy-proxy-0.3.tar.gz#sha256=a924f75cf30bc6e0de43df1bd0d14538e46dcc7f700bfe7cbe8399785dcfcdad
  Removed scrapy-proxy==0.3 from https://files.pythonhosted.org/packages/40/02/4f488f2acc81e506882a7ad6316b6cd20f621ee6be86093fb18c0f084daf/scrapy-proxy-0.3.tar.gz#sha256=a924f75cf30bc6e0de43df1bd0d14538e46dcc7f700bfe7cbe8399785dcfcdad from build tracker '/tmp/pip-req-tracker-2r6ee1_c'
Building wheels for collected packages: scrapy-proxy
Created temporary directory: /tmp/pip-wheel-zavml40c
  Running setup.py bdist_wheel for scrapy-proxy: started
  Destination directory: /tmp/pip-wheel-zavml40c
  Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-wheel-64huivt0/scrapy-proxy/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-zavml40c
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/scrapy_proxy
  copying scrapy_proxy/__init__.py -> build/lib/scrapy_proxy
  copying scrapy_proxy/middleware.py -> build/lib/scrapy_proxy
  installing to build/bdist.linux-armv7l/wheel
  running install
  running install_lib
  creating build/bdist.linux-armv7l
  creating build/bdist.linux-armv7l/wheel
  creating build/bdist.linux-armv7l/wheel/scrapy_proxy
  copying build/lib/scrapy_proxy/__init__.py -> build/bdist.linux-armv7l/wheel/scrapy_proxy
  copying build/lib/scrapy_proxy/middleware.py -> build/bdist.linux-armv7l/wheel/scrapy_proxy
  running install_egg_info
  running egg_info
  creating scrapy_proxy.egg-info
  writing scrapy_proxy.egg-info/PKG-INFO
  writing dependency_links to scrapy_proxy.egg-info/dependency_links.txt
  writing top-level names to scrapy_proxy.egg-info/top_level.txt
  writing requirements to scrapy_proxy.egg-info/requires.txt
  writing manifest file 'scrapy_proxy.egg-info/SOURCES.txt'
  reading manifest file 'scrapy_proxy.egg-info/SOURCES.txt'
  writing manifest file 'scrapy_proxy.egg-info/SOURCES.txt'
  Copying scrapy_proxy.egg-info to build/bdist.linux-armv7l/wheel/scrapy_proxy-0.3-py3.4.egg-info
  running install_scripts
  creating build/bdist.linux-armv7l/wheel/scrapy_proxy-0.3.dist-info/WHEEL
  creating '/tmp/pip-wheel-zavml40c/scrapy_proxy-0.3-py3-none-any.whl' and adding 'build/bdist.linux-armv7l/wheel' to it
  adding 'scrapy_proxy/__init__.py'
  adding 'scrapy_proxy/middleware.py'
  adding 'scrapy_proxy-0.3.dist-info/METADATA'
  adding 'scrapy_proxy-0.3.dist-info/WHEEL'
  adding 'scrapy_proxy-0.3.dist-info/top_level.txt'
  adding 'scrapy_proxy-0.3.dist-info/RECORD'
  removing build/bdist.linux-armv7l/wheel
  Running setup.py bdist_wheel for scrapy-proxy: finished with status 'done'
  Stored in directory: /tmp/tmp64v73z8j
Successfully built scrapy-proxy
Cleaning up...
Removing source in /tmp/pip-wheel-64huivt0/scrapy-proxy
Removed build tracker '/tmp/pip-req-tracker-2r6ee1_c'
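The "#sha256=..." fragments on the download URLs above are the digests pip compares against the bytes it actually fetched. As a minimal sketch of that check (the payload and helper name here are hypothetical, not part of pip's API):

```python
import hashlib


def verify_sha256(data: bytes, expected_hex: str) -> bool:
    # Compute the artifact's actual digest and compare it to the pinned
    # value, analogous to pip checking the "#sha256=..." URL fragment.
    return hashlib.sha256(data).hexdigest() == expected_hex


# Hypothetical payload standing in for a downloaded sdist:
payload = b"example sdist bytes"
pinned = hashlib.sha256(payload).hexdigest()

assert verify_sha256(payload, pinned)
assert not verify_sha256(payload + b"tampered", pinned)
```

A mismatch at this point would make pip abort the download rather than build the tarball.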
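The wheel name produced above, scrapy_proxy-0.3-py3-none-any.whl, follows the PEP 427 convention {distribution}-{version}(-{build})?-{python}-{abi}-{platform}.whl; the "py3-none-any" tags are why this pure-Python wheel, built on an armv7l host, is installable anywhere. A small sketch of decoding such a filename (this helper is illustrative, not a pip function):

```python
def parse_wheel_filename(filename: str) -> tuple:
    # PEP 427 layout: {distribution}-{version}(-{build})?-{python}-{abi}-{platform}.whl
    stem = filename[: -len(".whl")]
    parts = stem.split("-")
    name, version = parts[0], parts[1]
    # The last three fields are always the compatibility tags.
    python_tag, abi_tag, platform_tag = parts[-3], parts[-2], parts[-1]
    return name, version, python_tag, abi_tag, platform_tag


print(parse_wheel_filename("scrapy_proxy-0.3-py3-none-any.whl"))
# → ('scrapy_proxy', '0.3', 'py3', 'none', 'any')
```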