
Download images from Baidu Images

BaiduImagesDownload


BaiduImagesDownload is a fast and simple tool for crawling images from Baidu Images.

from BaiduImagesDownload.crawler import Crawler

net, num, urls = Crawler.get_images_url('二次元', 20)
Crawler.download_images(urls)


Installation

pip install BaiduImagesDownload

Usage

Basic usage

from BaiduImagesDownload.crawler import Crawler

net, num, urls = Crawler.get_images_url('二次元', 20)
Crawler.download_images(urls)

Setting the image format

from BaiduImagesDownload.crawler import Crawler

# rule defaults to ('.png', '.jpg')
net, num, urls = Crawler.get_images_url('二次元', 20)
Crawler.download_images(urls, rule=('.png', '.jpg'))

Setting the timeout

from BaiduImagesDownload.crawler import Crawler

# timeout defaults to 60 seconds
net, num, urls = Crawler.get_images_url('二次元', 20, timeout=60)
Crawler.download_images(urls, rule=('.png', '.jpg'), timeout=60)

Documentation

get_images_url

class Crawler:

    @staticmethod
    def get_images_url(word: str, num: int, timeout: int = __CONCURRENT_TIMEOUT) -> (bool, bool, list):

Parameters

  • word: str: the search keyword
  • num: int: the number of images to fetch
  • timeout: int: request timeout in seconds, default 60

Returns

  • net: bool: whether the network connection succeeded, True on success, False on failure
  • num: bool: whether enough images were found, True if the requested number was reached, False otherwise
  • urls: list: the collected URLs; each item is a dict with two keys, obj_url and from_url, where obj_url is the URL of the image itself and from_url is the Referer to send when fetching it
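
For illustration, the `urls` list has the shape below (the entries are hypothetical sample data, not output from a real request). If you fetch `obj_url` yourself instead of calling download_images, sending `from_url` as the Referer header matters, since many image hosts reject requests without it:

```python
# Hypothetical sample of the `urls` list returned by get_images_url
urls = [
    {'obj_url': 'https://example.com/a.png', 'from_url': 'https://image.baidu.com/page1'},
    {'obj_url': 'https://example.com/b.jpg', 'from_url': 'https://image.baidu.com/page2'},
]

def referer_headers(urls):
    # Build per-image request headers: from_url becomes the Referer
    return [{'Referer': item['from_url']} for item in urls]

print(referer_headers(urls))
```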

download_images

class Crawler:

    @staticmethod
    def download_images(urls: list, rule: tuple = ('.png', '.jpg'),
                        path: str = 'download', timeout: int = __CONCURRENT_TIMEOUT,
                        concurrent: int = __CONCURRENT_NUM) -> (int, int):

Parameters

  • urls: list: the images to download, in the same format as returned by get_images_url
  • rule: tuple, optional: the file formats allowed for download, default ('.png', '.jpg')
  • path: str, optional: the directory to download images into, default 'download'
  • timeout: int, optional: request timeout in seconds, default 60
  • concurrent: int, optional: the number of concurrent downloads, default 100

Returns

  • success: int: the number of successful downloads
  • failed: int: the number of failed downloads
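
As a sketch of what the rule parameter implies (an assumption about its behavior, not the package's internal code), filtering by extension can be done like this:

```python
def filter_by_rule(urls, rule=('.png', '.jpg')):
    # Keep only items whose image URL ends with an allowed extension;
    # str.endswith accepts a tuple of suffixes, matching the rule tuple shape.
    return [item for item in urls if item['obj_url'].lower().endswith(rule)]

# Hypothetical sample data in the get_images_url format
urls = [
    {'obj_url': 'https://example.com/a.png', 'from_url': 'https://image.baidu.com/p1'},
    {'obj_url': 'https://example.com/b.gif', 'from_url': 'https://image.baidu.com/p2'},
]
print(len(filter_by_rule(urls)))
```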

Logging

You can configure the log level and output destination; see Python's logging module for details.

import logging
import os

from BaiduImagesDownload.crawler import logger

# Set the log level to DEBUG (the default is INFO)
logger.setLevel(logging.DEBUG)

# Also write the log to a file; expanduser resolves the '~' prefix,
# which FileHandler does not expand on its own
file_handler = logging.FileHandler(os.path.expanduser('~/BaiduImagesDownload.log'))
file_handler.setFormatter(logging.Formatter(
    '[%(asctime)s] [%(levelname)s] %(message)s'))  # set the output format
logger.addHandler(file_handler)

License

License: MIT


Download files


Source Distribution

BaiduImagesDownload-1.0.2.tar.gz (6.2 kB)


Built Distribution


BaiduImagesDownload-1.0.2-py3-none-any.whl (7.4 kB)


File details

Details for the file BaiduImagesDownload-1.0.2.tar.gz.

File metadata

  • Download URL: BaiduImagesDownload-1.0.2.tar.gz
  • Upload date:
  • Size: 6.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.8.2

File hashes

Hashes for BaiduImagesDownload-1.0.2.tar.gz

  • SHA256: 774ce0433c091872351271520c43bad3ac560a06e1531cf3853a093b83adcade
  • MD5: 25d5e91a62a1a618bd84339f491c2047
  • BLAKE2b-256: c86f6caed3eae5e477783739e30fa4de63a0862dd7fdbd61bebe9ec51ed5fd56


File details

Details for the file BaiduImagesDownload-1.0.2-py3-none-any.whl.

File metadata

  • Download URL: BaiduImagesDownload-1.0.2-py3-none-any.whl
  • Upload date:
  • Size: 7.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.8.2

File hashes

Hashes for BaiduImagesDownload-1.0.2-py3-none-any.whl

  • SHA256: 801d57b55a6c0e47de778ef5f55af6cb4733c948a66ef68aeaf1d5d328a576f5
  • MD5: 182adb262bbf730688f3c9604e6c2984
  • BLAKE2b-256: 981ed6077812217166ebe0407c5908ded86640904bc47d125fa65c46ac6dc22c

