
Tools for downloading and preserving MediaWikis. We archive MediaWikis, from Wikipedia to the tiniest wikis.

Project description

wikiteam3

Countless MediaWikis are still waiting to be archived.


wikiteam3 is a fork of mediawiki-scraper.

Why we forked mediawiki-scraper

Originally, mediawiki-scraper was named wikiteam3, but the wikiteam upstream (the py2 version) suggested the name be changed to avoid confusion with the original wikiteam.
Half a year later, we had not seen any py3 porting progress in the original wikiteam, and mediawiki-scraper lacked code reviewers.
So we decided to disregard that suggestion, fork the project, rename it back to wikiteam3, put the code here, and release it to PyPI.

Everything is still under the GPLv3 license.

Installation

pip install wikiteam3 --upgrade

Dumpgenerator usage

Downloading a wiki with complete XML history and images

wikiteam3dumpgenerator http://wiki.domain.org --xml --images

Manually specifying api.php and/or index.php

If the script can't find the api.php and/or index.php paths on its own, you can provide them:

wikiteam3dumpgenerator --api http://wiki.domain.org/w/api.php --xml --images
wikiteam3dumpgenerator --api http://wiki.domain.org/w/api.php --index http://wiki.domain.org/w/index.php \
    --xml --images

If you only want the XML histories, use --xml alone. For only the images, use --images. For only the current version of every page, use --xml --curonly.

Resuming an incomplete dump

wikiteam3dumpgenerator \
    --api http://wiki.domain.org/w/api.php --xml --images --resume --path /path/to/incomplete-dump

In the above example, --path is only necessary if the download path is not the default.

wikiteam3dumpgenerator will also ask whether you want to resume if it finds an incomplete dump in the path where it is downloading.

Using wikiteam3uploader

Requirements

  • an unbound port 62954

  • 3 GB+ RAM (~2.56 GB for compressing)

  • a 64-bit OS (required by the 2 GB wlog size)

  • 7z (7z-full with LZMA2)

  • zstd 1.5.5+ (recommended); v1.5.0-v1.5.4 (DO NOT USE); 1.4.8 (minimum)
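Note that the zstd requirement has a hole in the middle of the version range. A minimal sketch of checking a version string against it (the function name and classification strings are illustrative, not part of wikiteam3):

```shell
# Classify a zstd version string against the ranges listed above.
# Hypothetical helper; adapt it to however you query your installed zstd.
zstd_version_class() {
  case "$1" in
    1.5.[0-4])              echo "broken - do not use" ;;  # v1.5.0-v1.5.4
    1.5.*|1.[6-9].*)        echo "recommended" ;;          # 1.5.5+
    1.4.8|1.4.9|1.4.1[0-9]) echo "usable minimum" ;;       # 1.4.8+
    *)                      echo "too old or unrecognized" ;;
  esac
}

# Example: feed it the version of the locally installed zstd, if any:
# zstd_version_class "$(zstd --version | grep -oE '[0-9]+\.[0-9]+\.[0-9]+')"
zstd_version_class 1.5.5   # prints: recommended
```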

Uploader usage

wikiteam3uploader {YOUR_WIKI_DUMP_PATH}

Checking dump integrity

TODO: xml2titles.py

If you want to check the XML dump integrity, run this in your command line to count the title, page and revision XML tags:

grep -Ec '<title(.*?)>' *.xml
grep -Ec '<page(.*?)>' *.xml
grep -c '</page>' *.xml
grep -Ec '<revision(.*?)>' *.xml
grep -c '</revision>' *.xml

You should see something similar to this (not the actual numbers): the first three numbers should match each other, and the last two should match each other:

580
580
580
5677
5677

If the first three numbers or the last two numbers differ, your XML dump is corrupt (one or more pages or revisions are missing their closing </page> or </revision> tags). This is uncommon in small wikis, but large or very large wikis may fail here because XML pages get truncated while exporting and merging. The fix is to delete the XML dump and re-download it, which is tedious and can fail again.
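The comparison above can be wrapped in a small script. A sketch, checking one dump file at a time (the function name and the demo file under /tmp are illustrative); note that, like the grep commands above, it counts matching lines, so it assumes at most one tag per line in a real dump:

```shell
# Sketch: automate the tag-count comparison for a single XML dump file.
check_dump_file() {
  f="$1"
  titles=$(grep -Ec '<title(.*?)>' "$f")
  pages=$(grep -Ec '<page(.*?)>' "$f")
  pages_end=$(grep -c '</page>' "$f")
  revs=$(grep -Ec '<revision(.*?)>' "$f")
  revs_end=$(grep -c '</revision>' "$f")
  # A consistent dump has equal title/page/</page> counts
  # and equal revision/</revision> counts.
  if [ "$titles" -eq "$pages" ] && [ "$pages" -eq "$pages_end" ] \
     && [ "$revs" -eq "$revs_end" ]; then
    echo "consistent"
  else
    echo "corrupt"
  fi
}

# Tiny self-contained demo input (illustrative, not a real dump):
printf '<page><title>A</title><revision>r</revision></page>\n' > /tmp/demo.xml
check_dump_file /tmp/demo.xml   # prints: consistent
```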

Contributors

WikiTeam is the Archive Team [GitHub] subcommittee on wikis. It was founded and originally developed by Emilio J. Rodríguez-Posada, a Wikipedia veteran editor and amateur archivist. Thanks to people who have helped, especially to: Federico Leva, Alex Buie, Scott Boyd, Hydriz, Platonides, Ian McEwen, Mike Dupont, balr0g and PiRSquared17.

Mediawiki-Scraper: The Python 3 initiative is currently being led by Elsie Hupp, with contributions from Victor Gambier, Thomas Karcher, Janet Cobb, yzqzss, NyaMisty and Rob Kam.

WikiTeam3: None yet.

Project details


Download files


Source Distribution

wikiteam3-4.1.3.tar.gz (179.8 kB view details)

Uploaded Source

Built Distribution


wikiteam3-4.1.3-py3-none-any.whl (101.0 kB view details)

Uploaded Python 3

File details

Details for the file wikiteam3-4.1.3.tar.gz.

File metadata

  • Download URL: wikiteam3-4.1.3.tar.gz
  • Upload date:
  • Size: 179.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.1 CPython/3.10.12 Linux/5.15.0-84-generic

File hashes

Hashes for wikiteam3-4.1.3.tar.gz
Algorithm Hash digest
SHA256 82a4078d3a7b88db61a408083b16b51b2d24c017dd9c006c14bd1375972bc579
MD5 a9c8d8ab8fa149feb90db2396f0107e0
BLAKE2b-256 1242034d654e46514221eae5164f13cbc2e02b3b1b1d6f37e2222cf98821d978


File details

Details for the file wikiteam3-4.1.3-py3-none-any.whl.

File metadata

  • Download URL: wikiteam3-4.1.3-py3-none-any.whl
  • Upload date:
  • Size: 101.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.1 CPython/3.10.12 Linux/5.15.0-84-generic

File hashes

Hashes for wikiteam3-4.1.3-py3-none-any.whl
Algorithm Hash digest
SHA256 4d477783285d1119b4ec9a9f6ee11fabc6aff1a1f22f471a7d21933af3bace96
MD5 ee099489b19e2d23ff2681e77948065b
BLAKE2b-256 45177aa660e64bbf6c245a21268d6a1a2042773cbf4322f96e959ed8a31070ce

