Basic introduction to backing up The Psychology Wiki

Download and Backup information.

Quoting from the Wikicities download page: a good reason to use Wikicities is that a MySQL database dump of every Wikicity is available for download under the terms of the GFDL. The dump covers the public SQL tables (it does not include private user data) and can therefore be imported into another MySQL database, preferably one in which MediaWiki has already been installed. To download the content of the "cur" table (the most recent edit of every page), go to: http://name of your wiki.wikicities.com/dbdumps/cur_table.sql.bz2 and for the "old" table (every edit except the most recent), go to: http://name of your wiki.wikicities.com/dbdumps/old_table.sql.bz2

Downloading the SQL file is straightforward, as described at: http://www.wikicities.com/wiki/Database_download
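As an illustration, downloading a single table dump can be sketched as below. The wiki name `psychology` is an assumption here; substitute your own subdomain. The wget and bunzip2 lines are left commented so the sketch can be read without triggering a large download:

```shell
#!/bin/sh
# Build the dump URL for one wiki (NAME is a placeholder; change it).
NAME="psychology"
DUMP="cur_table.sql.bz2"
URL="http://${NAME}.wikicities.com/dbdumps/${DUMP}"

echo "Would fetch: $URL"
# Uncomment to actually download and decompress:
# wget "$URL"
# bunzip2 "$DUMP"
```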

I downloaded all 22,000 pages (218MB) of the Star Wars site in just over an hour. It came down as a 152MB compressed file over my cable connection at approximately 45kb/sec, and took just over four minutes to decompress.

No privileged access was required.

The MediaWiki software needed to run a copy of the wiki is available as a free download at http://www.mediawiki.org/wiki/Download

To install MediaWiki, you will need an operating system (GNU/Linux is suggested), a web server (Apache 2 is suggested), PHP ≥ 4.3 (PHP 4.3.11 is suggested) and MySQL ≥ 3.23 (MySQL 4.0.23 is suggested). You do not need a specific software package to manage MySQL, but such a tool may help you with backups and other tasks (phpMyAdmin ≥ 2.6.1 is suggested).

MediaWiki ships with an installation utility that enables you to install it using your web browser. If you do not operate your own server and only have access to one database, note that MediaWiki is able to share its database with other applications.

You do not need to know SQL, HTML, XHTML or CSS. Basic knowledge of PHP is required only if you want to customize MediaWiki to your particular needs.
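Once MediaWiki is installed, a downloaded table dump can be loaded into its database. This is only a sketch: the database name `psywiki` and user `wikiuser` are assumptions, and the import command is echoed rather than executed so it can be inspected (and credentials filled in) before running it for real:

```shell
#!/bin/sh
# Hypothetical database name and user; adjust to your own installation.
DB="psywiki"
USER="wikiuser"
DUMP="cur_table.sql.bz2"

# Decompress the dump and pipe it into the MediaWiki database.
# Shown via echo so the command can be checked before it is run:
CMD="bunzip2 -c ${DUMP} | mysql -u ${USER} -p ${DB}"
echo "$CMD"
```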

It seems prudent and unproblematic to institute a regular backup routine run by a team of, say, ten dedicated people working in pairs. Five pairs would cover five days a week if each pair took one day. Pairing gives redundancy for most of the year, and pairs can cover for each other during holidays by agreement if substitutes cannot be found.

This procedure would give us the capacity to reconstruct the site if anything untoward happened to the main site, although I am sure Wikicities will be mirroring off-site soon.

However, I personally intend to back up the site every evening at 11.00pm GMT for my own peace of mind.
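A nightly run need not be done by hand. Assuming the download script below is saved as /home/user/backup_wiki.sh (a hypothetical path), a crontab entry such as the following, added with `crontab -e`, would run it every evening at 23:00:

```
# m  h   dom mon dow  command
0    23  *   *   *    /home/user/backup_wiki.sh
```

Note that cron uses the machine's local time, so the hour should be adjusted if the machine is not set to GMT.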

Quoting from the site again: backups of all Wikicities databases are saved to a file on the same hardware and copied automatically to a second server, so the backups exist on two different machines. Both machines are in the same colocation facility, so it is not quite "off-site". The following script may be used to download the backups, and it can be developed further:
```shell
#!/bin/sh

# Exit on any error
set -e

# The name of our wiki and our current/old backup directories
NAME="your_wiki_name"
BACKUP_NEW="new_backup"
BACKUP_OLD="old_backup"

if test -d ${BACKUP_OLD}
then
    rm -rf ${BACKUP_OLD}
fi

if test -d ${BACKUP_NEW}
then
    mv ${BACKUP_NEW} ${BACKUP_OLD}
fi

mkdir ${BACKUP_NEW}
cd ${BACKUP_NEW}

wget http://${NAME}.wikicities.com/dbdumps/cur_table.sql.bz2
wget http://${NAME}.wikicities.com/dbdumps/old_table.sql.bz2
wget http://${NAME}.wikicities.com/dbdumps/brokenlinks_table.sql.bz2
wget http://${NAME}.wikicities.com/dbdumps/categorylinks_table.sql.bz2
wget http://${NAME}.wikicities.com/dbdumps/imagelinks_table.sql.bz2
wget http://${NAME}.wikicities.com/dbdumps/image_table.sql.bz2
wget http://${NAME}.wikicities.com/dbdumps/links_table.sql.bz2
wget http://${NAME}.wikicities.com/dbdumps/oldimage_table.sql.bz2
wget http://${NAME}.wikicities.com/dbdumps/site_stats_table.sql.bz2

# Report the size of the new backup, then of both backups together
du -sh
cd ..
du -sh
```
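After a run, it is worth checking that the downloaded dumps are not truncated or corrupt before relying on them; bzip2 can test an archive's integrity without extracting it. The sketch below assumes the `new_backup` directory used by the script above. The fixture lines only exist to make the example self-contained and should be removed in real use:

```shell
#!/bin/sh
# Fixture (remove in real use): ensure the directory exists and
# contains at least one valid .sql.bz2 archive to check.
mkdir -p new_backup
printf 'sample' | bzip2 > new_backup/sample.sql.bz2

# Test each downloaded dump's integrity without decompressing it.
for f in new_backup/*.sql.bz2; do
    if bunzip2 -t "$f" 2>/dev/null; then
        echo "OK: $f"
    else
        echo "CORRUPT: $f"
    fi
done
```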