Wednesday, February 24, 2010

Stubborn Data CD Recovery

Have you ever come across a stubborn data CD that just refuses to copy? Creating an ISO image failed in both Windows and Linux - the Windows PC would not even recognize the CD in either of its optical drives!

Had to use a unique combination to recover/copy this data using Linux and some handy commands.

0. Insert the CD into the drive and let it mount.

1. Navigate to the CD and copy the data
cp -r /media/* /path/to/copied/files

2. Compare the Original and the copy for any differences
diff -qr /media/ /path/to/copied/files

Got no errors this time, but if any differences are found, it should be as simple as navigating to the source file and copying it to the backup directory.
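If diff does report differences, the re-copy can be scripted. Here is a rough sketch - recopy_diffs is a hypothetical helper (not from the original post), and it assumes no file names contain " and " or ": ":

```shell
#!/bin/sh
# recopy_diffs SRC DST: re-copy anything diff -qr flags as missing or changed.
# Hypothetical helper; breaks on file names containing " and " or ": ".
recopy_diffs() {
    src=$1
    dst=$2
    diff -qr "$src" "$dst" | while IFS= read -r line; do
        case $line in
            "Files "*)
                # "Files SRC/f and DST/f differ" -> overwrite the bad copy
                f=${line#Files }
                f=${f%% and *}
                rel=${f#"$src"/}
                cp -f "$f" "$dst/$rel"
                ;;
            "Only in $src"*)
                # "Only in SRC/dir: name" -> copy the missing entry over
                d=${line#Only in }
                d=${d%%: *}
                name=${line##*: }
                rel=${d#"$src"}
                rel=${rel#/}
                mkdir -p "$dst/$rel"
                cp -r "$d/$name" "$dst${rel:+/$rel}/$name"
                ;;
        esac
    done
}

# Usage with the paths from this post:
# recopy_diffs /media /path/to/copied/files
```

After it runs, re-running the diff from step 2 should come back clean.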

Being on ancient hardware meant that this Linux machine did not have a CD burner. The files were copied to a Windows PC to burn onto CD.



Friday, February 19, 2010

In pursuit of a monitoring solution for CentOS Linux

It's about that time: setting up a monitoring/analysis solution for the Linux servers has become almost necessary. The hardware is getting a bit outdated, and empirical data speaks for itself when asking for new equipment!

Based on my research, there is no one perfect solution - especially when it comes to Open Source and Linux.

At the very least, the following will need to be monitored:
  • Apache
  • Mysql
  • Network Traffic
  • Disk Space
  • Server Load
  • Memory Utilization
  • Swap usage
  • Processes
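Until one of those packages is in place, several of the items on this list (disk, load, memory, swap, processes) can be spot-checked by hand with standard tools:

```shell
# Quick manual spot-checks (standard Linux commands):
df -h                 # disk space per filesystem
uptime                # server load averages
free -m               # memory and swap utilization, in MB
ps aux | head -15     # a first look at running processes
```

Apache, MySQL, and network traffic need their own tooling, which is where the packages below come in.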

In the next few weeks some tests will be done using the following software offerings (in no particular order):
  • cacti
  • munin
  • nagios
  • opennms
  • zabbix
  • zenoss
Which one will be the best fit? Time and Testing will tell.


Wednesday, February 17, 2010

Recovery of a DNS server

There was a DNS server sitting on the network that had been around for ages. Thermal overload and an 'unclean reboot' on an old Red Hat install finally killed it. At least there was failover support!

After some discussions with some colleagues, it was eventually decided that an attempted revival was out of the question.  Rebuild time!

The weapon OS of choice was CentOS (of course), and an attempt was made to salvage whatever data and configurations possible from the old server.

Tomorrow: an update on what was done.

Luckily, we were able to retrieve the named.conf and the databases from the dead server using a SLAX Live CD.

0. Assuming a Clean install of CentOS and Ethernet / other system configurations complete - log in as root

1. First, update the system to the latest packages
yum update

2. Install the following packages
yum install bind bind-chroot bind-libs bind-utils

It should be noted that the old DNS server did not use the chroot security option, which is essentially a 'jail' that prevents an attacker from gaining full system access through a BIND exploit. This meant that the locations of the conf file and the databases needed to be changed for the new install.

3. Rename the original named.conf
mv /var/named/chroot/etc/named.conf /var/named/chroot/etc/named.conf.orig

4. Copy the named.conf which was taken from the old server
cp /location/of/backup/named.conf /var/named/chroot/etc/

5. Copy the zones
cp /location/of/backup/zones/* /var/named/chroot/var/named/
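For reference, the zone databases are just plain text. A minimal BIND zone looks something like the sketch below - example.com, the 192.0.2.x documentation addresses, and the serial/timer values are all placeholders standing in for real data:

```
$TTL 86400
@       IN      SOA     ns1.example.com. admin.example.com. (
                        2010021701      ; serial
                        3600            ; refresh
                        900             ; retry
                        604800          ; expire
                        86400 )         ; minimum
        IN      NS      ns1.example.com.
ns1     IN      A       192.0.2.1
www     IN      A       192.0.2.10
```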

6. Check to see if the named service is operational
service named restart

Once it starts without problems, proceed to step 7.

7. Set named to automatically start on reboot.
chkconfig named on

8. Configure the firewall
If the server is only being used for DNS, only allow incoming DNS and perhaps SSH/Telnet.

Set firewall to Enabled (*)
Go to Customize
Leave Trusted Devices and Masquerade Devices empty (dependent on your configuration)
Add to "other ports" section
53:tcp 53:udp
Check box by either SSH or Telnet (whichever preferred)

Save and exit
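Behind the scenes, the firewall tool just writes rules to /etc/sysconfig/iptables. The relevant fragment ends up looking roughly like this (chain name and rule layout as generated on CentOS 5 - shown here as an assumption, not copied from the actual server):

```
-A RH-Firewall-1-INPUT -m state --state NEW -m tcp -p tcp --dport 53 -j ACCEPT
-A RH-Firewall-1-INPUT -m state --state NEW -m udp -p udp --dport 53 -j ACCEPT
-A RH-Firewall-1-INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT
```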

9. Check to see if service is operational
service named status

This setup was based on a restore, so a new install of Bind will need additional tweaking dependent on the environment.

Update: Permissions issue was preventing updates to the server.

10. Further permission fix

cd /var/named/chroot/etc
*Backup named.conf
cp -p named.conf named.conf.bkp

cd /var/named/chroot/var/named
chown named:named db.*

cd /var/named
chown -R named:named ./chroot

chmod g+w /var/named/chroot/var/named


Monday, February 8, 2010

Installing the Eaccelerator cache for php - on CentOS

eAccelerator is an open-source optimizer/cache for PHP. For a Moodle install, it typically brings down server load (a lot!) by caching frequently requested content. The following setup was used for CentOS 5.x, but should be similar for other Linux flavours.

0. Login as root.

1. Server preparation
The following packages are needed for the eAccelerator install. From the terminal:
yum  -y install php-devel
yum -y groupinstall 'Development Tools'

2. Get the eAccelerator Package
mkdir /temp (if not already created)
cd /temp
tar xvfj eaccelerator-

3. Configure and Install
phpize
./configure --with-eaccelerator-shared-memory
make
make install

4. Create Cache Directory and set permissions
mkdir /var/cache/eaccelerator
chmod 777 /var/cache/eaccelerator

5. Create the config file
nano /etc/php.d/eaccelerator.ini
Add the following lines:
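As an example, a typical eaccelerator.ini (the defaults suggested in the eAccelerator README, with cache_dir pointed at the directory from step 4 - treat shm_size as a starting point to tune):

```ini
extension="eaccelerator.so"
eaccelerator.shm_size="16"
eaccelerator.cache_dir="/var/cache/eaccelerator"
eaccelerator.enable="1"
eaccelerator.optimizer="1"
eaccelerator.check_mtime="1"
eaccelerator.debug="0"
```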


6. Restart the webserver
service httpd restart

7. Check to see if installed properly
nano /var/www/html/phpinfo.php
Add to file
<?php phpinfo(); ?>
Save and exit
Now call the script from your internet browser: http://yourservername/phpinfo.php
Look for the Eaccelerator Section
After a short time, the Cached Scripts entry should be > 0.
It works!


Thursday, February 4, 2010

Backing up Moodle Datafiles to a remote server

As part of any disaster recovery plan, a backup of the MoodleData folder is also essential. The critical variable in this equation is disk space - there never seems to be enough!

Making regular backups of the entire MoodleData folder is a bit impractical - at least in my scenario, where it is over 100 GB. Instead, I chose to mirror the moodledata folder on a different server, using rsync. This is purely for a disaster recovery scenario and will not consider data recovery on a per-user basis. In an environment of 22000+ users, this is highly impractical.

Assumptions: Another server has already been configured with CentOS, all necessary firewall/SELinux/networking settings have been completed.
In this example, one machine acts as the Moodle Server and a second as the Backup Server.

0. Login as root on Moodle Server

A "trust" needs to be set up between the Moodle Server and the Backup Server, so that the automated backup does not fail due to password prompts! It is also an added measure of security: root/sudo passwords are not stored in plain text in the shell scripts.

1. Setup a trust for password free rsync over ssh
ssh-keygen -t dsa
cat ~/.ssh/id_dsa.pub | ssh root@yourbackupserver 'cat >> .ssh/authorized_keys'
You will be prompted for the password ONCE. Enter it correctly and that's it!

2. Test the password-free connection (this will only be one way). From the Moodle Server:
ssh -l root yourbackupserver
The backup server should be accessible without entering a password!

3. On the Backup Server, create a folder on the largest available disk.
mkdir /path/to/mirror

4. On the Moodle Server, create the script that will mirror the moodledata folder.
cd /temp/scripts
nano moodledatamirror.sh
chmod +x moodledatamirror.sh

5. Edit the script to log each time the folder is mirrored. Ensure that the log folder exists [/temp/logs]
Add to file:

rsync --delete-after -e ssh -avz /path/to/your/moodledata/folder/ root@yourbackupserver:/path/to/mirror
date >> /temp/logs/moodledatamirror.log

Save and exit

6. Test script
cd /temp/scripts
./moodledatamirror.sh
Some output should be visible and the script will notify when complete.

7. Check to see if the files went across properly!
Login via SSH to the Backup Server:
ssh -l root yourbackupserver
cd /path/to/mirror
ls -all

Compare the results of the list to the original folder on the Moodle Server!

8. Add the script to the crontab to execute nightly
Schedule to run this script at periods of low activity - 2:00am should be fine for most installs. Remember to factor in the Database backup time!

nano /etc/crontab
Append the following:

#moodledata mirror script

00 2 * * *  root /temp/scripts/moodledatamirror.sh

The MoodleData files will now be mirrored nightly!
This same strategy can be used to transfer the database backups to the backup server.


Wednesday, February 3, 2010

Restoring a Moodle Database Backup

This is an addendum to my earlier post: Backing Up your Moodle Database

Now you have a database backup. In the event that something horrible happens (knock on wood) and the database needs to be restored...
This assumes that the database is located in the default MySQL installation location: /var/lib/mysql/yourdbname

0. Login as root on the database server

1. Stop the mysqld service
service mysqld stop

2. Backup the current corrupted database (in case forensic analysis is needed later)
cp -rp /var/lib/mysql/yourdbname /var/lib/mysql/yourdbname_backup

3. Empty the files from the database.
cd /var/lib/mysql/yourdbname
rm -rf *

4. Copy the backup to the empty database shell
cp /path/to/your/backup/* /var/lib/mysql/yourdbname/
*make sure that the folder /var/lib/mysql/yourdbname/ contains only MYI, MYD, frm and opt files.
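That check can itself be scripted with find. A sketch using a scratch directory - /tmp/dbdir and the file names below are hypothetical stand-ins for /var/lib/mysql/yourdbname and its contents:

```shell
# List anything in the restored database dir that is NOT a MYI/MYD/frm/opt file.
DBDIR=/tmp/dbdir    # stand-in for /var/lib/mysql/yourdbname
mkdir -p "$DBDIR"
touch "$DBDIR/mdl_config.MYI" "$DBDIR/mdl_config.MYD" \
      "$DBDIR/mdl_config.frm" "$DBDIR/db.opt" "$DBDIR/stray.tmp"
# Anything this prints should be removed before starting mysqld:
find "$DBDIR" -type f ! -name '*.MYI' ! -name '*.MYD' ! -name '*.frm' ! -name '*.opt'
```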

5.  Ensure that permissions are correctly set.
cd /var/lib/mysql/yourdbname/
chown mysql:mysql *
chmod 771 *

6. Start the Mysql Service.
service mysqld start

7. Run a Check/Repair/Optimize on the restored database.
This is to ensure consistency of the data you just restored.
There is no need to login to mysql, simply execute from the terminal.

mysqlcheck yourdbname -c -o -a -r -f -h localhost -u yourdbusername -pyourdbpassword

Note: If the database for the Moodle install is large, a weekly or nightly optimization can be configured to minimize the chances of irreparable corruption.
This can be done similarly to the cron job in my earlier post.

7.1 Create the script
cd /temp/scripts
nano databasecheck.sh
chmod +x databasecheck.sh
7.2. Edit the script
Add the following (I chose to log each time the script runs)

mysqlcheck yourdbname -c -o -a -r -f -h localhost -u yourdbusername -pyourdbpassword > /temp/logs/database_check_$(date +%Y%m%d)_$(date +%H%M).txt
The log folder needs to be created so the script does not throw an error!
cd /temp
mkdir logs
7.3 Test the script
cd /temp/scripts
./databasecheck.sh

check the latest logfile for output
cat /temp/logs/database_check_xxxxxx.txt
You should see some output with the table names and status next to it - e.g
mdl_config      OK


Backing up Moodle Database

This particular utility is only applicable to MySQL databases using the MyISAM storage engine. I could have used the mysqldump utility, but I prefer the mysqlhotcopy tool, as it executes much faster on larger databases with less overhead.

So the Moodle install is up and running. Time for a simple database backup using the Mysqlhotcopy tool. It allows for literally "hot" copying of the database without stopping the service!

This can be shell scripted to run nightly.

0. Login as root.

1. Make a script directory and create a bare shell script
mkdir /temp/scripts && cd /temp/scripts
nano databasebackup.sh
chmod +x databasebackup.sh

2. Edit the script using vi or nano (I <3 nano)
cd /temp/scripts
nano databasebackup.sh
Add these lines (edit to include your database name, username, password, etc.). Since the database password will be stored in the file as plain text, make sure the folder is owned by root. If preferred, the date can be appended to the end of the folder name for simple identification.
mysqlhotcopy -u yourdbusername -p yourdbpassword --addtodest yourdbname /path/to/backup/folder
mv /path/to/backup/folder /path/to/backup/folder_$(date +%Y%m%d)_$(date +%H%M)

Save and exit
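The date-suffix rename can be tried in isolation first. In this sketch, /tmp/backupfolder is a hypothetical stand-in for the real backup path:

```shell
# Rename a folder with a _YYYYMMDD_HHMM suffix, as the backup script does.
BACKUP=/tmp/backupfolder        # stand-in for /path/to/backup/folder
rm -rf "${BACKUP}" "${BACKUP}"_*   # clean scratch area so the sketch can re-run
mkdir -p "$BACKUP"
STAMP="$(date +%Y%m%d)_$(date +%H%M)"
mv "$BACKUP" "${BACKUP}_${STAMP}"
ls -d "${BACKUP}"_*
```

Note that both $(date ...) substitutions need the leading $ - without it, the shell treats the parentheses as a literal string and the rename fails.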

3. Test the script to make sure it works as advertised!
cd /temp/scripts
./databasebackup.sh

Once it runs without errors, it can be set to run automatically via cron.

4. Add the script to the crontab to execute nightly
I prefer to run this script at periods of low activity - 1:00am should be fine for most installs.

nano /etc/crontab
Append the following:

#database backup script

00 1 * * *  root /temp/scripts/databasebackup.sh

This tutorial demonstrates how to make a copy of the raw database files on the same server where the database resides.

Later on (when I get some time), I  will show how to extend the strategy to compress the backup and transfer the files to another server - as well as how to do a Database restore from the raw files.


Tuesday, February 2, 2010

Moodle Installation on CentOS 5.x

This is for a single server Moodle Install.

Assumption: A base install of CentOS 5.x is completed, and mysqld and httpd have already been installed.

1. Server preparation - Login as root or superuser
yum install php* php-mysql httpd* mysql*
Not all of these packages are necessary, but if you are going to install any plugins, they may be needed.

2. Make sure MySQL and Apache are started and will start automatically on reboot.
service mysqld start
service httpd start
chkconfig httpd on
chkconfig mysqld on

3. Assign a root password to mysql
/usr/bin/mysqladmin -u root password 'yourpasswordhere'
It is recommended to set a secure root password!

4. Login to MySQL, create an empty database, and assign a user for Moodle.
mysql -u root -pyourpasswordhere
At the mysql prompt:

CREATE DATABASE mydbname DEFAULT CHARACTER SET utf8 COLLATE utf8_unicode_ci;

GRANT select,insert,update,delete,create,drop,index,alter
ON mydbname.*
TO mymoodleuser@localhost IDENTIFIED BY 'moodleuserpassword';

flush privileges;


5. Download moodle and extract to webserver root.
cd /tmp
[check for latest version!!!]
tar xzf moodle-weekly-19.tgz 
cd moodle-weekly-19
cp -r * /var/www/html/
chown -R root:root /var/www/html/

6. Create Moodledata folder [not in webserver root!]
mkdir /usr/moodledata
cd /usr/moodledata
chown -R apache:apache /usr/moodledata

Note: I use the paranoid file permissions recommended in the Moodle docs (see "Paranoid Moodle Permissions").

7. Setup your config.php
See the Moodle installation documentation for further details.

8. Edit Apache config file 
I prefer nano, not VIM
nano /etc/httpd/conf/httpd.conf
Add these lines to the end of the file:
<Directory "/usr/moodle/mymoodle">
DirectoryIndex index.php
AcceptPathInfo on
AllowOverride None
Options None
Order allow,deny
Allow from all
</Directory>

9. Setup the Moodle Cron job.
nano /etc/crontab
add line
*/5 * * * * /usr/bin/wget -O /dev/null http://localhost/admin/cron.php

10. Open up a web browser on the server and hit: http://localhost/admin

Happy Moodling!


Monday, February 1, 2010

First and foremost...

I need to start collecting my notes, which seem to be scattered all over the place.
A memory stick here, a disconnected IDE drive there, perhaps even on one of the (many!) desktops that have to deal with my nonsense, and last but not least, the notepads that seem to have more doodling than actual writing.