Collecting pyspark_db_utils==0.0.5
  1 location(s) to search for versions of pyspark-db-utils:
  * https://pypi.python.org/simple/pyspark-db-utils/
  Getting page https://pypi.python.org/simple/pyspark-db-utils/
  Analyzing links from page https://pypi.python.org/simple/pyspark-db-utils/
    Found link https://pypi.python.org/packages/19/d2/ba25666ceefc03abf356f08c1cf185aa2e76dc039455fd46a3dc6f0f69bb/pyspark_db_utils-0.0.4.tar.gz#md5=540c0006132752114be456160ea4f3f9 (from https://pypi.python.org/simple/pyspark-db-utils/), version: 0.0.4
    Found link https://pypi.python.org/packages/c2/2a/3088de4e8f1507da8235a73d5c59edb82d6be49232cb2843a4537a1af06a/pyspark_db_utils-0.0.5.tar.gz#md5=412b8b5f695e5cffb3409cedc4b5d809 (from https://pypi.python.org/simple/pyspark-db-utils/), version: 0.0.5
    Found link https://pypi.python.org/packages/dd/22/c029444468c9bb92a78a75ab1275e3d0a23c5cf9459831227db54450ae63/pyspark_db_utils-0.0.1.tar.gz#md5=40e29c66c9fe47a4e2fd633cd3fbeaa8 (from https://pypi.python.org/simple/pyspark-db-utils/), version: 0.0.1
    Found link https://pypi.python.org/packages/e7/4c/38ffe31a254a31104820be35891c56fcab5b6f42724a962cb8421b097991/pyspark_db_utils-0.0.3.tar.gz#md5=5319f69ed85309bd8b8caf07fcb7e3ed (from https://pypi.python.org/simple/pyspark-db-utils/), version: 0.0.3
  Using version 0.0.5 (newest of versions: 0.0.5)
  Downloading pyspark_db_utils-0.0.5.tar.gz
  Downloading from URL https://pypi.python.org/packages/c2/2a/3088de4e8f1507da8235a73d5c59edb82d6be49232cb2843a4537a1af06a/pyspark_db_utils-0.0.5.tar.gz#md5=412b8b5f695e5cffb3409cedc4b5d809 (from https://pypi.python.org/simple/pyspark-db-utils/)
  Running setup.py (path:/tmp/pip-build-ujii_84q/pyspark-db-utils/setup.py) egg_info for package pyspark-db-utils
    Running command python setup.py egg_info
    running egg_info
    creating pip-egg-info/pyspark_db_utils.egg-info
    writing requirements to pip-egg-info/pyspark_db_utils.egg-info/requires.txt
    writing pip-egg-info/pyspark_db_utils.egg-info/PKG-INFO
    writing dependency_links to pip-egg-info/pyspark_db_utils.egg-info/dependency_links.txt
    writing top-level names to pip-egg-info/pyspark_db_utils.egg-info/top_level.txt
    writing manifest file 'pip-egg-info/pyspark_db_utils.egg-info/SOURCES.txt'
    reading manifest file 'pip-egg-info/pyspark_db_utils.egg-info/SOURCES.txt'
    reading manifest template 'MANIFEST.in'
    writing manifest file 'pip-egg-info/pyspark_db_utils.egg-info/SOURCES.txt'
  Source in /tmp/pip-build-ujii_84q/pyspark-db-utils has version 0.0.5, which satisfies requirement pyspark_db_utils==0.0.5 from https://pypi.python.org/packages/c2/2a/3088de4e8f1507da8235a73d5c59edb82d6be49232cb2843a4537a1af06a/pyspark_db_utils-0.0.5.tar.gz#md5=412b8b5f695e5cffb3409cedc4b5d809
Building wheels for collected packages: pyspark-db-utils
  Running setup.py bdist_wheel for pyspark-db-utils: started
  Destination directory: /tmp/tmpqmbfr0mbpip-wheel-
  Running command /usr/bin/python3 -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-ujii_84q/pyspark-db-utils/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" bdist_wheel -d /tmp/tmpqmbfr0mbpip-wheel-
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib
  creating build/lib/pyspark_db_utils
  copying pyspark_db_utils/pg.py -> build/lib/pyspark_db_utils
  copying pyspark_db_utils/__init__.py -> build/lib/pyspark_db_utils
  copying pyspark_db_utils/example.py -> build/lib/pyspark_db_utils
  creating build/lib/pyspark_db_utils/ch
  copying pyspark_db_utils/ch/smart_ch_fillna.py -> build/lib/pyspark_db_utils/ch
  copying pyspark_db_utils/ch/make_ch_model_for_df.py -> build/lib/pyspark_db_utils/ch
  copying pyspark_db_utils/ch/read_from_ch.py -> build/lib/pyspark_db_utils/ch
  copying pyspark_db_utils/ch/write_to_ch.py -> build/lib/pyspark_db_utils/ch
  copying pyspark_db_utils/ch/__init__.py -> build/lib/pyspark_db_utils/ch
  creating build/lib/pyspark_db_utils/utils
  copying pyspark_db_utils/utils/drop_columns.py -> build/lib/pyspark_db_utils/utils
  copying pyspark_db_utils/utils/__init__.py -> build/lib/pyspark_db_utils/utils
  running egg_info
  writing top-level names to pyspark_db_utils.egg-info/top_level.txt
  writing dependency_links to pyspark_db_utils.egg-info/dependency_links.txt
  writing pyspark_db_utils.egg-info/PKG-INFO
  writing requirements to pyspark_db_utils.egg-info/requires.txt
  reading manifest file 'pyspark_db_utils.egg-info/SOURCES.txt'
  reading manifest template 'MANIFEST.in'
  writing manifest file 'pyspark_db_utils.egg-info/SOURCES.txt'
  installing to build/bdist.linux-armv7l/wheel
  running install
  running install_lib
  creating build/bdist.linux-armv7l
  creating build/bdist.linux-armv7l/wheel
  creating build/bdist.linux-armv7l/wheel/pyspark_db_utils
  copying build/lib/pyspark_db_utils/pg.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils
  creating build/bdist.linux-armv7l/wheel/pyspark_db_utils/ch
  copying build/lib/pyspark_db_utils/ch/smart_ch_fillna.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils/ch
  copying build/lib/pyspark_db_utils/ch/make_ch_model_for_df.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils/ch
  copying build/lib/pyspark_db_utils/ch/read_from_ch.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils/ch
  copying build/lib/pyspark_db_utils/ch/write_to_ch.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils/ch
  copying build/lib/pyspark_db_utils/ch/__init__.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils/ch
  creating build/bdist.linux-armv7l/wheel/pyspark_db_utils/utils
  copying build/lib/pyspark_db_utils/utils/drop_columns.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils/utils
  copying build/lib/pyspark_db_utils/utils/__init__.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils/utils
  copying build/lib/pyspark_db_utils/__init__.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils
  copying build/lib/pyspark_db_utils/example.py -> build/bdist.linux-armv7l/wheel/pyspark_db_utils
  running install_egg_info
  Copying pyspark_db_utils.egg-info to build/bdist.linux-armv7l/wheel/pyspark_db_utils-0.0.5-py3.4.egg-info
  running install_scripts
  creating build/bdist.linux-armv7l/wheel/pyspark_db_utils-0.0.5.dist-info/WHEEL
  Running setup.py bdist_wheel for pyspark-db-utils: finished with status 'done'
  Stored in directory: /tmp/tmph1777epw
Successfully built pyspark-db-utils
Cleaning up...
  Removing source in /tmp/pip-build-ujii_84q/pyspark-db-utils
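The per-link index analysis and setup.py egg_info/bdist_wheel steps above are what an older pip (pre-10, still using the legacy pypi.python.org index) prints at increased verbosity. A sketch of the kind of command that would produce such a log — the -v flag is an assumption, as the original invocation is not shown:

```
# Hypothetical reconstruction of the invocation behind the log above.
# -v raises verbosity so pip prints link analysis and build sub-steps;
# the exact pip version and any extra flags used are unknown.
pip install -v pyspark_db_utils==0.0.5
```

Note the build host is ARM (build/bdist.linux-armv7l) and the sdist is built into a wheel locally, since no prebuilt wheel is published for this package.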