ScrapydWeb: Full-featured web UI for monitoring and controlling Scrapyd servers
Feature Support
- Multinode Scrapyd Servers
  - Group, filter and select any number of nodes
  - Execute a command on multiple nodes with one click
- Scrapy Log Analysis
  - Collect statistics
  - Show crawling progress with charts
  - Extract key logs
- All Scrapyd APIs supported
  - Deploy project, run spider, stop job
  - List projects/versions/spiders/running jobs
  - Delete version/project
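As a sketch of what those actions wrap, Scrapyd exposes a plain HTTP JSON API that ScrapydWeb calls under the hood. The helper below maps a few actions to their documented endpoints; the server address, helper name, and action keys are illustrative assumptions, not part of ScrapydWeb's code:

```python
# Sketch: map a few UI actions to the Scrapyd HTTP API endpoints
# they correspond to. SCRAPYD_URL and api_url are illustrative names.
SCRAPYD_URL = "http://127.0.0.1:6800"  # Scrapyd's default address

ENDPOINTS = {
    "run_spider": "schedule.json",        # POST project=..., spider=...
    "stop_job": "cancel.json",            # POST project=..., job=...
    "list_projects": "listprojects.json",
    "list_spiders": "listspiders.json",   # GET project=...
    "delete_project": "delproject.json",
}

def api_url(action: str) -> str:
    """Return the full URL for a Scrapyd API action."""
    return f"{SCRAPYD_URL}/{ENDPOINTS[action]}"

print(api_url("run_spider"))  # http://127.0.0.1:6800/schedule.json
```

ScrapydWeb's value is issuing these calls across all selected nodes at once, rather than one server at a time.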
Installation
To install ScrapydWeb, simply use pip:
$ pip install scrapydweb
Start Up
Run "scrapydweb -h" for help. On first run, a config file named "scrapydweb_settings.py" is copied to the working directory; edit it to customize your configuration.
$ scrapydweb
Visit http://127.0.0.1:5000
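For example, a minimal "scrapydweb_settings.py" might just register the Scrapyd nodes to monitor. The sketch below is an assumption about the generated file's contents; verify the option names against the copy in your own working directory:

```python
# scrapydweb_settings.py -- a minimal sketch; check option names against
# the file that `scrapydweb` copies into your working directory.
SCRAPYD_SERVERS = [
    '127.0.0.1:6800',                      # a plain host:port entry
    # 'username:password@host:port#group'  # assumed format for auth/groups
]
```

With multiple entries listed, the UI can group, filter, and command those nodes together as described above.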
Screenshot
- Overview
- Dashboard
- Manage Projects
- Run Spider
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
scrapydweb-0.9.1.tar.gz (573.3 kB)

Built Distribution
scrapydweb-0.9.1-py3-none-any.whl (596.8 kB)
Hashes for scrapydweb-0.9.1-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 0347c82061792055630243ce82485c7edbac643f4d17119e4f8aecccfbf6f4c7
MD5 | 59b9e400ca021ccce7a46e4e267ef03a
BLAKE2b-256 | 1d0c761ca6cac41af6e3671f8942889eaf8c82e6e35b59e596e0798457d55cbb