MetaBrainz Hosting / MBH-182

An XML dump/archive/backup of the wiki should be created (optionally on a regular basis)


    • Type: Task
    • Resolution: Won't Fix
    • Priority: Normal

      I'm currently working to archive a few wikis I care about, including the MB wiki. Since we don't have an exposed api.php, this is proving to be a gigantic pain in the butt. Others have asked for this ("this" being an XML dump) as well.

      It's very easy for someone with server access: https://www.mediawiki.org/wiki/Manual:DumpBackup.php

      Specifically: cd to the relevant maintenance directory, then: php dumpBackup.php --full > whatever.xml

      Compression is usually highly effective on these dumps, since they're large, repetitive XML documents (I've seen size reductions of more than 98% on other wiki dumps). Then, simply make the file available somewhere. Or email it to me (ianmcorvidae@ianmcorvidae.net) and I'll propagate it to third-party locations like archive.org.
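As a quick illustration of the compression claim above, here is a synthetic demonstration: the XML below is generated sample data, not a real wiki dump, but it shows how well repetitive XML markup compresses.

```shell
# Generate a repetitive XML file resembling a wiki dump's structure
# (synthetic sample data, not real wiki content).
for i in $(seq 1 5000); do
  printf '<page><title>Page %s</title><text>Lorem ipsum dolor sit amet.</text></page>\n' "$i"
done > sample.xml

# Compress it and compare sizes.
gzip -9 < sample.xml > sample.xml.gz
wc -c sample.xml sample.xml.gz
```

On repetitive markup like this, gzip routinely shrinks the file by well over 90%, which is why shipping the compressed dump is cheap.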

      An image dump would also be great, though it's probably larger and more cumbersome; it amounts to packaging up the images subdirectory of the wiki install (probably excluding the 'deleted' directory). The XML dump alone would be sufficient in the short term.
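A sketch of the image-dump step, using tar's `--exclude` to skip the 'deleted' directory. The directory layout below is a stand-in created for illustration, not the real wiki install path.

```shell
# Stand-in for the wiki's images subdirectory (hypothetical layout).
mkdir -p images/a/ab images/deleted
echo 'fake image data' > images/a/ab/Example.png
echo 'removed upload'  > images/deleted/Old.png

# Package everything except the 'deleted' directory.
tar --exclude='images/deleted' -czf images-dump.tar.gz images

# List the archive: 'deleted' should not appear.
tar -tzf images-dump.tar.gz
```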

      So, minimally: run the dumpBackup.php command mentioned above, compress the output, and send it to ianmcorvidae.
      Ideally: set up something to run dumpBackup.php and compress automatically, making the dumps available at a predictable location.
      Even more ideally: do the above, plus image dumps as well.
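The "ideally" step might look like the crontab fragment below. The schedule, install path, and output directory are all assumptions for illustration, not the actual MetaBrainz setup (note that `%` must be escaped as `\%` inside crontab entries).

```shell
# Hypothetical crontab fragment: weekly full dump every Sunday at 03:00,
# gzipped and dropped into a web-served directory. All paths are assumptions.
# m h dom mon dow  command
0 3 * * 0  cd /srv/mediawiki/maintenance && php dumpBackup.php --full | gzip -9 > /var/www/dumps/wiki-$(date +\%F).xml.gz
```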

            Assignee: djce Dave Evans
            Reporter: ianmcorvidae Ian McEwen
            Votes: 0
            Watchers: 1

