Database backups
This article details the procedures and configurations for performing regular database backups of your MediaWiki 1.40 installation. Consistent and reliable backups are critical for disaster recovery and ensuring the longevity of your wiki's data. This guide assumes you have basic system administration knowledge and access to the server's command line.
Understanding Backup Strategies
There are several approaches to database backups. The most common are:
- Full Backups: Copies the entire database. These are the simplest to restore but take the longest to create and require the most storage space.
- Incremental Backups: Copies only the changes made since the *last* full or incremental backup. Faster and smaller than full backups, but restoration requires the full backup *and* all subsequent incremental backups.
- Differential Backups: Copies only the changes made since the *last* full backup. Larger than incremental backups, but restoration only requires the full backup and the latest differential backup.
For MediaWiki, this guide focuses on full backups because they are the simplest to create and to restore reliably; incremental backups can supplement them if daily full dumps become too slow for your wiki's size.
Backup Methods
The primary method for backing up the MediaWiki database is using the database server's native backup tools. This guide covers the most common database systems: MySQL/MariaDB and PostgreSQL. It’s crucial to understand your database server’s specific tools and options for optimal performance and reliability.
MySQL/MariaDB Backups
The `mysqldump` utility is the standard tool for backing up MySQL/MariaDB databases.
Here's a basic example command:
```bash
mysqldump -u [username] -p[password] [database_name] > /path/to/backup/mediawiki_backup.sql
```
Replace `[username]`, `[password]`, and `[database_name]` with your actual database credentials and name. `/path/to/backup/mediawiki_backup.sql` is the desired location and filename for the backup.
For larger databases, consider using `--single-transaction` to obtain a consistent snapshot of InnoDB tables without locking them, and pipe the output through a compressor such as `gzip` to reduce file size (note that mysqldump's `--compress` option compresses client/server traffic, not the dump file itself).
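For example, a consistent, compressed dump might look like the following sketch; the database name `wikidb`, the user, and the paths are placeholders you should replace with your own values:

```bash
# Consistent snapshot of an InnoDB-backed wiki database, compressed on the fly.
# --password with no value prompts interactively; for unattended runs, put the
# credentials in an option file such as ~/.my.cnf instead.
mysqldump --user=wikiuser --password \
    --single-transaction --quick \
    wikidb | gzip > /var/backups/mediawiki/mediawiki_$(date +%F).sql.gz
```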
Here's a table detailing common `mysqldump` options:
Option | Description |
---|---|
`--user=[username]` | Specifies the MySQL username. |
`--password=[password]` | Specifies the MySQL password. (Caution: avoid storing passwords directly in scripts; prefer an option file such as `~/.my.cnf`.) |
`--databases [database_name]` | Treats the named argument(s) as databases to dump and includes `CREATE DATABASE`/`USE` statements in the output. |
`--single-transaction` | Creates a consistent snapshot without locking tables. Recommended for InnoDB tables. |
`--compress` | Compresses the data sent between client and server; it does not compress the dump file itself (pipe through `gzip` for that). |
`--quick` | Dumps the data row by row, avoiding buffering the entire result set in memory. Useful for large tables. |
`--lock-tables=false` | Disables table locking during the dump. Use with caution: without `--single-transaction` this can produce inconsistent backups. |
PostgreSQL Backups
The `pg_dump` utility is used for backing up PostgreSQL databases.
A basic example command is:
```bash
pg_dump -U [username] -d [database_name] -f /path/to/backup/mediawiki_backup.sql
```
Replace `[username]` and `[database_name]` with your actual credentials and name. `/path/to/backup/mediawiki_backup.sql` is the desired backup file location.
PostgreSQL offers several backup formats (plain text, custom, directory, tar). The custom format (`-Fc`) produces a compressed archive that `pg_restore` can restore in parallel; the directory format (`-Fd`) additionally supports parallel dumps via `-j`, which can significantly speed up the process for large databases.
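As a sketch (the database name `wikidb`, the user, and the paths are placeholders):

```bash
# Custom-format archive: compressed, and restorable in parallel with pg_restore -j.
pg_dump -U wikiuser -d wikidb -Fc -f /var/backups/mediawiki/mediawiki_$(date +%F).dump

# Directory-format archive: the dump itself can run in parallel (4 jobs here).
pg_dump -U wikiuser -d wikidb -Fd -j 4 -f /var/backups/mediawiki/mediawiki_$(date +%F).dir
```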
Here's a table of common `pg_dump` options:
Option | Description |
---|---|
`-U [username]` | Specifies the PostgreSQL username. |
`-d [database_name]` | Specifies the database to back up. |
`-f [filename]` | Specifies the output file. |
`-Fc` | Uses the custom archive format, which is compressed and can be restored in parallel with `pg_restore`. |
`-j [number]` | Runs the dump with the given number of parallel jobs (directory format, `-Fd`, only). |
`-v` | Verbose mode, providing more output during the backup process. |
Backup Scheduling and Rotation
Automating backups is essential. Use a scheduling tool like Cron to run backup scripts regularly.
Here’s an example of a Cron job entry to run a full MySQL backup daily at 2:00 AM:
```
0 2 * * * /path/to/backup_script.sh
```
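A minimal `backup_script.sh` for MySQL/MariaDB might look like the following sketch; the paths, database name, and credential handling are assumptions to adapt to your environment:

```bash
#!/bin/bash
# Example backup script: dump the wiki database and compress it.
set -euo pipefail

BACKUP_DIR="/var/backups/mediawiki"   # example backup location
DB_NAME="wikidb"                      # example database name
STAMP="$(date +%F)"

mkdir -p "$BACKUP_DIR"

# Credentials are read from an option file (e.g. ~/.my.cnf) rather than
# being placed on the command line.
mysqldump --single-transaction --quick "$DB_NAME" \
    | gzip > "$BACKUP_DIR/mediawiki_${STAMP}.sql.gz"
```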
A proper backup rotation strategy is crucial to manage storage space and ensure you have multiple recovery points. Consider keeping:
- Daily full backups for the past week.
- Weekly full backups for the past month.
- Monthly full backups for the past year.
Here’s a table outlining a sample backup rotation scheme:
Backup Type | Frequency | Retention Period |
---|---|---|
Full Backup | Daily | 7 days |
Full Backup | Weekly | 4 weeks |
Full Backup | Monthly | 12 months |
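Retention like this can be enforced with a cleanup step at the end of the backup script. For example, the following sketch deletes daily dumps older than seven days; the per-tier directory layout is an assumption:

```bash
# Remove daily dumps older than 7 days (assumes one directory per retention tier).
find /var/backups/mediawiki/daily -name 'mediawiki_*.sql.gz' -mtime +7 -delete
```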
Verification and Restoration
Regularly *test* your backups to ensure they are valid and can be restored. Attempt a full restoration to a test environment to verify the process. This is perhaps the most important step!
- **MySQL/MariaDB:** Use `mysql -u [username] -p [database_name] < /path/to/backup/mediawiki_backup.sql` to restore.
- **PostgreSQL:** Use `psql -U [username] -d [database_name] -f /path/to/backup/mediawiki_backup.sql` to restore a plain-text dump; custom- or directory-format archives are restored with `pg_restore`.
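If you used the custom or directory format, a parallel restore might look like this sketch (the database, user, and file names are examples):

```bash
# Restore a custom-format archive into an existing empty database,
# using 4 parallel jobs.
pg_restore -U wikiuser -d wikidb -j 4 /var/backups/mediawiki/mediawiki_2024-01-01.dump
```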
Remember to also back up your MediaWiki installation directory (including `LocalSettings.php`, `images/`, and any extensions) along with the database. This directory contains essential configuration files and uploaded media. See File storage for more details.
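A file-level copy can be as simple as the following sketch; the install path `/var/www/mediawiki` and the backup location are assumptions:

```bash
# Archive the wiki's files alongside the database dump
# (adjust /var/www/mediawiki to your actual install path).
tar -czf /var/backups/mediawiki/mediawiki_files_$(date +%F).tar.gz -C /var/www mediawiki
```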
Related Pages
- Configuration settings
- Database setup
- File storage
- Security best practices
- Cron (Unix)
- System administration
- MediaWiki extensions
- Maintenance tasks
- Troubleshooting
- Performance tuning