Created temporary directory: /tmp/pip-ephem-wheel-cache-9rqzum4o
Created temporary directory: /tmp/pip-req-tracker-f057avh0
Created requirements tracker '/tmp/pip-req-tracker-f057avh0'
Created temporary directory: /tmp/pip-wheel-g284_n2h
Looking in indexes: https://pypi.python.org/simple
Collecting scrapy-calaccess-crawler==0.0.1
  1 location(s) to search for versions of scrapy-calaccess-crawler:
  * https://pypi.python.org/simple/scrapy-calaccess-crawler/
  Getting page https://pypi.python.org/simple/scrapy-calaccess-crawler/
  Analyzing links from page https://pypi.org/simple/scrapy-calaccess-crawler/
    Found link https://files.pythonhosted.org/packages/22/de/cf522e37889e32ecf7a6ee6834890c0e0b00dace1c4d9366d0667958606a/scrapy-calaccess-crawler-0.0.1.tar.gz#sha256=90955fdad15b10836170f9e9a1e3984fed8991e11e727ddc4ab92af42531d529 (from https://pypi.org/simple/scrapy-calaccess-crawler/), version: 0.0.1
  Using version 0.0.1 (newest of versions: 0.0.1)
  Created temporary directory: /tmp/pip-unpack-varwq04t
  Downloading https://files.pythonhosted.org/packages/22/de/cf522e37889e32ecf7a6ee6834890c0e0b00dace1c4d9366d0667958606a/scrapy-calaccess-crawler-0.0.1.tar.gz
  Downloading from URL https://files.pythonhosted.org/packages/22/de/cf522e37889e32ecf7a6ee6834890c0e0b00dace1c4d9366d0667958606a/scrapy-calaccess-crawler-0.0.1.tar.gz#sha256=90955fdad15b10836170f9e9a1e3984fed8991e11e727ddc4ab92af42531d529 (from https://pypi.org/simple/scrapy-calaccess-crawler/)
  Added scrapy-calaccess-crawler==0.0.1 from https://files.pythonhosted.org/packages/22/de/cf522e37889e32ecf7a6ee6834890c0e0b00dace1c4d9366d0667958606a/scrapy-calaccess-crawler-0.0.1.tar.gz#sha256=90955fdad15b10836170f9e9a1e3984fed8991e11e727ddc4ab92af42531d529 to build tracker '/tmp/pip-req-tracker-f057avh0'
  Running setup.py (path:/tmp/pip-wheel-g284_n2h/scrapy-calaccess-crawler/setup.py) egg_info for package scrapy-calaccess-crawler
  Running command python setup.py egg_info
  running egg_info
  creating pip-egg-info/scrapy_calaccess_crawler.egg-info
  writing pip-egg-info/scrapy_calaccess_crawler.egg-info/PKG-INFO
  writing dependency_links to pip-egg-info/scrapy_calaccess_crawler.egg-info/dependency_links.txt
  writing top-level names to pip-egg-info/scrapy_calaccess_crawler.egg-info/top_level.txt
  writing requirements to pip-egg-info/scrapy_calaccess_crawler.egg-info/requires.txt
  writing manifest file 'pip-egg-info/scrapy_calaccess_crawler.egg-info/SOURCES.txt'
  warning: manifest_maker: standard file '-c' not found
  reading manifest file 'pip-egg-info/scrapy_calaccess_crawler.egg-info/SOURCES.txt'
  writing manifest file 'pip-egg-info/scrapy_calaccess_crawler.egg-info/SOURCES.txt'
  Source in /tmp/pip-wheel-g284_n2h/scrapy-calaccess-crawler has version 0.0.1, which satisfies requirement scrapy-calaccess-crawler==0.0.1 from https://files.pythonhosted.org/packages/22/de/cf522e37889e32ecf7a6ee6834890c0e0b00dace1c4d9366d0667958606a/scrapy-calaccess-crawler-0.0.1.tar.gz#sha256=90955fdad15b10836170f9e9a1e3984fed8991e11e727ddc4ab92af42531d529
  Removed scrapy-calaccess-crawler==0.0.1 from https://files.pythonhosted.org/packages/22/de/cf522e37889e32ecf7a6ee6834890c0e0b00dace1c4d9366d0667958606a/scrapy-calaccess-crawler-0.0.1.tar.gz#sha256=90955fdad15b10836170f9e9a1e3984fed8991e11e727ddc4ab92af42531d529 from build tracker '/tmp/pip-req-tracker-f057avh0'
Building wheels for collected packages: scrapy-calaccess-crawler
  Created temporary directory: /tmp/pip-wheel-_4t6wst7
  Running setup.py bdist_wheel for scrapy-calaccess-crawler: started
  Destination directory: /tmp/pip-wheel-_4t6wst7
  Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-wheel-g284_n2h/scrapy-calaccess-crawler/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/pip-wheel-_4t6wst7
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/calaccess_crawler
  copying calaccess_crawler/__init__.py -> build/lib/calaccess_crawler
  copying calaccess_crawler/settings.py -> build/lib/calaccess_crawler
  copying calaccess_crawler/proxies.py -> build/lib/calaccess_crawler
  copying calaccess_crawler/items.py -> build/lib/calaccess_crawler
  copying calaccess_crawler/pipelines.py -> build/lib/calaccess_crawler
  copying calaccess_crawler/loaders.py -> build/lib/calaccess_crawler
  copying calaccess_crawler/middlewares.py -> build/lib/calaccess_crawler
  creating build/lib/calaccess_crawler/spiders
  copying calaccess_crawler/spiders/__init__.py -> build/lib/calaccess_crawler/spiders
  copying calaccess_crawler/spiders/candidates_spider.py -> build/lib/calaccess_crawler/spiders
  copying calaccess_crawler/spiders/incumbents_spider.py -> build/lib/calaccess_crawler/spiders
  copying calaccess_crawler/spiders/propositions_spider.py -> build/lib/calaccess_crawler/spiders
  running egg_info
  writing dependency_links to scrapy_calaccess_crawler.egg-info/dependency_links.txt
  writing scrapy_calaccess_crawler.egg-info/PKG-INFO
  writing top-level names to scrapy_calaccess_crawler.egg-info/top_level.txt
  writing requirements to scrapy_calaccess_crawler.egg-info/requires.txt
  warning: manifest_maker: standard file '-c' not found
  reading manifest file 'scrapy_calaccess_crawler.egg-info/SOURCES.txt'
  writing manifest file 'scrapy_calaccess_crawler.egg-info/SOURCES.txt'
  installing to build/bdist.linux-armv7l/wheel
  running install
  running install_lib
  creating build/bdist.linux-armv7l
  creating build/bdist.linux-armv7l/wheel
  creating build/bdist.linux-armv7l/wheel/calaccess_crawler
  creating build/bdist.linux-armv7l/wheel/calaccess_crawler/spiders
  copying build/lib/calaccess_crawler/spiders/__init__.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler/spiders
  copying build/lib/calaccess_crawler/spiders/candidates_spider.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler/spiders
  copying build/lib/calaccess_crawler/spiders/incumbents_spider.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler/spiders
  copying build/lib/calaccess_crawler/spiders/propositions_spider.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler/spiders
  copying build/lib/calaccess_crawler/__init__.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler
  copying build/lib/calaccess_crawler/settings.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler
  copying build/lib/calaccess_crawler/proxies.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler
  copying build/lib/calaccess_crawler/items.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler
  copying build/lib/calaccess_crawler/pipelines.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler
  copying build/lib/calaccess_crawler/loaders.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler
  copying build/lib/calaccess_crawler/middlewares.py -> build/bdist.linux-armv7l/wheel/calaccess_crawler
  running install_egg_info
  Copying scrapy_calaccess_crawler.egg-info to build/bdist.linux-armv7l/wheel/scrapy_calaccess_crawler-0.0.1.egg-info
  running install_scripts
  creating build/bdist.linux-armv7l/wheel/scrapy_calaccess_crawler-0.0.1.dist-info/WHEEL
  creating '/tmp/pip-wheel-_4t6wst7/scrapy_calaccess_crawler-0.0.1-py3-none-any.whl' and adding '.' to it
  adding 'calaccess_crawler/__init__.py'
  adding 'calaccess_crawler/items.py'
  adding 'calaccess_crawler/loaders.py'
  adding 'calaccess_crawler/middlewares.py'
  adding 'calaccess_crawler/pipelines.py'
  adding 'calaccess_crawler/proxies.py'
  adding 'calaccess_crawler/settings.py'
  adding 'calaccess_crawler/spiders/__init__.py'
  adding 'calaccess_crawler/spiders/candidates_spider.py'
  adding 'calaccess_crawler/spiders/incumbents_spider.py'
  adding 'calaccess_crawler/spiders/propositions_spider.py'
  adding 'scrapy_calaccess_crawler-0.0.1.dist-info/top_level.txt'
  adding 'scrapy_calaccess_crawler-0.0.1.dist-info/WHEEL'
  adding 'scrapy_calaccess_crawler-0.0.1.dist-info/METADATA'
  adding 'scrapy_calaccess_crawler-0.0.1.dist-info/RECORD'
  removing build/bdist.linux-armv7l/wheel
  Running setup.py bdist_wheel for scrapy-calaccess-crawler: finished with status 'done'
  Stored in directory: /tmp/tmpjpwvypw7
Successfully built scrapy-calaccess-crawler
Cleaning up...
  Removing source in /tmp/pip-wheel-g284_n2h/scrapy-calaccess-crawler
Removed build tracker '/tmp/pip-req-tracker-f057avh0'
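The wheel built above is pure Python (py3-none-any) and contains only the calaccess_crawler package and its spiders subpackage. For reference, a minimal setuptools configuration along the following lines would yield the same egg_info and bdist_wheel layout; this is a sketch only, since the project's actual setup.py is not shown in the log, and the dependency list (the requires.txt written above) is an assumption.

# hypothetical sketch of a setup.py matching the build output above
from setuptools import setup, find_packages

setup(
    name="scrapy-calaccess-crawler",
    version="0.0.1",
    packages=find_packages(),  # picks up calaccess_crawler and calaccess_crawler.spiders
    install_requires=["scrapy"],  # assumed; the log does not show the contents of requires.txt
)

To double-check what ended up in the wheel, the archive can be listed with Python's zipfile module, e.g. python -m zipfile -l /tmp/pip-wheel-_4t6wst7/scrapy_calaccess_crawler-0.0.1-py3-none-any.whl (the temporary directory name will differ on each run).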