Gossamer Forum

backup tactic ?

backup tactic ?
Hello all,

I want to make a good backup of one database (all tables) in MySQL. The database and the backup are for a client, and I want to give him an easy way to back up all of his data in the MySQL database.

So I am thinking of not (?) using the ASCII data files in the backup dir, but maybe using the dump command instead (like in MySQLMan).

What is a smart way to do this (what should I add to the crontab?) so that the customer can grab one file and have his whole database, or would you suggest another way around it?

I am curious how you all make your backups, and why you do it that way.

Thanks in advance.

Regards Startpoint.

Re: backup tactic ? In reply to
As I believe I mentioned before...here is what you can do:

1) Add additional lines of code to the sub build_backup routine so that it makes individual backups of all the tables in the database.

2) Compress all the files into a tar.gz file that your client can download and then expand on his/her personal computer to restore tables when needed.

3) Use a separate script that EXPORTS all tables from your database...although why reinvent the wheel when this is already available through the nph-build.cgi script.

Regards.

Eliot Lee

Re: backup tactic ? In reply to
Thanks Eliot,

Well, you told me before and you are right, but I was wondering whether there is a 'better' way. I guess I have to dump the structure (without data) into a file, so that if we need to create a new database it will be easy. Then let them back up the files from admin/backup. I only need the Links and Categories files; the rest (like search_log) I don't need.

Are the files in the backup dir good to restore from? How do they handle things like the newline character and so on?

Thanks in advance.

Regards Startpoint.

Re: backup tactic ? In reply to
I don't really think that DUMPING all the data from the tables on a periodic basis will be the most effective approach in terms of restoring tables.

Again...if you back up all the tables, it will be easier to restore from the files in the backup directory.

Regards,

Eliot Lee

Re: backup tactic ? In reply to
Thanks Eliot...

OK, I'll go with backups from the backup directory because I trust you :)

Nah.... I was just thinking about what the best method would be, and maybe you are right. But I will first make one dump file with all the tables in it, so when I have to restore the data I can create a new table if needed and then restore just the two files, Links and Category; the rest I don't need, I guess..... But how do I restore those two files? With editor.cgi, or with MySQLMan?

Well thanks...

Regards Startpoint.

Re: backup tactic ? In reply to
The better application, IMHO, for restoring tables/databases is MySQLMan, but you can perform the same IMPORT functions in either MySQLMan or editor.cgi...

So...it is up to you.

BTW: The only time I would consider dumping data from the tables is when transferring accounts to a new server...but daily backups are the better solution in terms of restoring data, IMHO.

Regards,

Eliot Lee

Re: backup tactic ? In reply to
MySQL is not a system like Oracle.

If you need (or want) to make backups to another disk or to backup media, you do _not_ have to shut down the database. The on-disk files are a 'snapshot' of what the system looks like at that moment. You could, in theory, catch something like a link written to the Links table but not yet to the alternate links, but it's unlikely.

Every time Links builds, it makes a backup of the three main data files, and you can extend that to any tables you want.

If you run a system backup on a weekly or daily basis, the .../data directory can simply be copied.

Every so often I'll copy/tar/gz the data directory into specific backups.
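
For example, something along these lines from the shell or a cron job gives you a dated snapshot of the flat files (the paths are placeholders, and it assumes a tar that understands the z flag -- otherwise pipe through gzip):

Code:

tar czf /path/to/backups/linksdata.`date +%Y-%m-%d`.tar.gz /path/to/links/admin/data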

With the features of MySQLMan, I would not be surprised to see full, selective, and incremental-style backups available for Links SQL when the next version is finally released.

http://www.postcards.com
FAQ: http://www.postcards.com/FAQ/LinkSQL/

Re: backup tactic ? In reply to
Thanks guys,

Well, I'll go with the backup dir. Thanks...

The mailman just brought my MySQL book by Paul DuBois (New Riders), so I will have something to do :)..

Regards Startpoint.

Re: backup tactic ? In reply to
I really like:

mysqldump --add-drop-table -uUSERNAME -pPASSWORD DATABASENAME > backup.txt

which will create a "snapshot" of your database. Then to restore, simply type:

mysql -uUSERNAME -pPASSWORD DATABASENAME < backup.txt

and presto, it's back to the state it was in before. You can use MySQLMan to do this from the web using the SQL Dump and SQL Monitor functions; however, it's quite simple, so you are probably better off putting it in a shell script that gets run automatically.
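
For example, a bare-bones version of such a script might look like this (an untested sketch -- the mysqldump path, credentials, database name, and output path are placeholders to replace with your own):

Code:

#!/bin/sh
# Nightly dump -- adjust the paths and credentials for your setup.
/path/to/mysqldump --add-drop-table -uUSERNAME -pPASSWORD DATABASENAME > /path/to/backup.txt

A crontab entry pointing at that script (for example, one run every night) is then all the automation you need.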

Cheers,

Alex

--
Gossamer Threads Inc.
Re: backup tactic ? In reply to
Thanks for the reply, Alex.

Well, that was what I was thinking the first time...

Eh.... Thanks... what shall I do :)...

Maybe I should rewrite nph-build so that it creates backup.txt, backup1.txt, and so on, instead of all the separate files, right?
Would that be an easy mod?

Regards Startpoint.

Re: backup tactic ? In reply to
Sure you could add:

system ("/path/to/mysqldump --add-drop-table -uUSERNAME -pPASSWORD DATABASENAME > /path/to/backup.txt");

in your nph-build. This can take a while, especially on large databases with large search indexes. You could also try and make the filename it saves to dynamic so you can store the last 7 backups or something.
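
For example (an untested sketch with placeholder paths and credentials), date-stamping the dump filename could look like:

Code:

# Untested sketch -- placeholder paths/credentials.
# Build a yyyy-mm-dd stamp and dump into a file named after it.
my ($d, $m, $y) = (localtime)[3, 4, 5];
my $stamp = sprintf("%04d-%02d-%02d", $y + 1900, $m + 1, $d);
system ("/path/to/mysqldump --add-drop-table -uUSERNAME -pPASSWORD DATABASENAME > /path/to/backups/backup.$stamp.txt");

Because yyyy-mm-dd names sort chronologically, keeping only the last 7 is then just a matter of sorting the filenames and unlinking the oldest ones.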

Cheers,

Alex

--
Gossamer Threads Inc.
Re: backup tactic ? In reply to
Alex,

Great suggestions...although I cannot use your method, since my MySQL password contains a special character...Wink

I keep getting this error message:

Code:

/usr/local/bin/mysqldump: Got error: 1045: Access denied for user: 'anthrotech@localhost' (Using password: YES)


I did replace PASSWORD and USERNAME and DATABASENAME with my correct access information.
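
(Maybe single-quoting the password so the shell passes it through untouched would help -- just a guess on my part, and untested:)

Code:

/usr/local/bin/mysqldump --add-drop-table -uUSERNAME -p'PASSWORD' DATABASENAME > backup.txt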

I did come up with an alternative solution that exports the relevant tables and then puts the .db backup files into a tar.gz file. I also hacked the backup routine a bit to keep only the latest backup per day...the problem I saw with the $max_keep variable is that it keeps only the last seven backups, so if you run the build script 6 or 7 times a day (like I do), all you get is seven backup copies for that one day...and not for consecutive days. Smile

Regards,

Eliot Lee

Re: backup tactic ? In reply to
For anyone wondering, all you need to do is check, when you attempt the backup, whether the file you are trying to create already exists. If it does, print out a warning that "backup for today already exists .... skipping".

Then, if you know you didn't do a backup, something is wrong; but if you did, you still have your _original_ backup for the day.

The next time the day changes, the filename rolls over, the file won't exist, and so it can be written... simple. Effective. I used it on my database before I went to MySQL, so I would always have one backup per day.

Your filename would look like the standard log backup -- backupname.yyyy-mm-dd.ext
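
In Perl, that check only takes a few lines -- a rough sketch, with placeholder paths and names:

Code:

# Sketch: build today's backup name and skip the run if it already exists.
my ($d, $m, $y) = (localtime)[3, 4, 5];
my $date = sprintf("%04d-%02d-%02d", $y + 1900, $m + 1, $d);
my $file = "/path/to/backups/backupname.$date.ext";

if (-e $file) {
    print "Backup for today already exists .... skipping\n";
}
else {
    # ... write the new backup out to $file here ...
}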




http://www.postcards.com
FAQ: http://www.postcards.com/FAQ/LinkSQL/

Re: backup tactic ? In reply to
Welp...pugdog...backups were successfully made; however, the only backup files stored in my directory were:

Code:

Categories.db
Categories1.db
Categories2.db
Categories3.db
Categories4.db
Categories5.db
Categories6.db
Categories7.db
Links.db
Links1.db
Links2.db
Links3.db
Links4.db
Links5.db
Links6.db
Links7.db


That's all...no day values were associated with the backup files...and all that happened was that backups stopped, and I had to clear out my backup directory to get backups working again...so that is why I extracted the backup routine from nph-build.cgi and tweaked it to do the following:

1) Export data from the relevant tables.
2) Create a tar.gz for the day.
3) Then clear out all the separate .db files.
4) Check whether there is already a backup for the day; if there is, skip the backup process.

I've also set the backup script to run nightly at 23:00:00, so that I only keep the latest backup file for the day.

Regards,

Eliot Lee

Re: backup tactic ? In reply to
Don't get me wrong.... there's more to rotating logs, but you can either do it that way, or you can check the file date to see whether a backup was made for 'today'.

You always have to knock the last one off the queue -- FIFO -- but if you use the 1-7 notation, you have to rename all the logs in sequence after deleting log xxx1.

If you use the backupname.date.ext notation, you can cut the .date. out and do a simple comparison, toss the oldest, and write out a new one.

It's not a single line of code, but it's not really tricky either. And, the file name contains the date of the backup, so you don't have to rely on timestamp information which often isn't preserved via FTP.
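
A rough sketch of that comparison (untested, and the names are placeholders):

Code:

# Sketch: pull the date out of backupname.yyyy-mm-dd.ext, keep the
# newest $keep files, and toss the rest.
my $keep    = 7;
my @backups = glob("/path/to/backups/backupname.*.ext");
my %date_of = map { /\.(\d{4}-\d{2}-\d{2})\./ ? ($_ => $1) : () } @backups;

# yyyy-mm-dd strings compare correctly as plain text, oldest first.
my @oldest_first = sort { $date_of{$a} cmp $date_of{$b} } keys %date_of;
unlink @oldest_first[0 .. $#oldest_first - $keep] if @oldest_first > $keep;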

http://www.postcards.com
FAQ: http://www.postcards.com/FAQ/LinkSQL/

Re: backup tactic ? In reply to
Thanks for the comments...I appreciate them...but I have come up with a pretty good solution that fits my purposes.

Regards,

Eliot Lee

Re: backup tactic ? In reply to
Eliot,

Can you share the hack/mod? (One backup a day sounds fine to me.)

Thanks in advance.

Regards Startpoint.

Re: backup tactic ? In reply to
It is a relatively rudimentary Mod....

The benefits of this Mod are the following:

1) Reduced disk space...(the current backup process stores about 3 MB of data in the backup directory per day, versus about 200 KB via the compressed tar.gz file).

2) Keeps the latest data - stores the latest data of the day rather than 5 or 7 backup copies per day.

3) Easier storage and data transfer - you can simply download the single tar.gz file rather than waiting to download an abundance of backup copies.

Here is what you do...

1) Add the following variables in your Links.pm file:

Code:

# Backup Path - No Trailing Slash
$LINKS{db_backup_path} = "$LINKS{admin_root_path}/backup";
# Tar Path - No Trailing Slash
$LINKS{tar_path} = "/usr/local/bin/tar";
# GZIP Path - No Trailing Slash
$LINKS{gzip_path} = "/usr/local/bin/gzip";


To find the tar and gzip paths on your server, simply type the following commands at the telnet command prompt:

Code:

which tar
which gzip


OR

Code:

whereis tar
whereis gzip


You will also have to make sure that the backup directory exists in your admin directory...make sure that the directory permissions are world-readable/writable (777).

2) Then add DBSQL objects before the sub build_all routine in the nph-build.cgi file...

Example:

Code:

$CATALT = new Links::DBSQL "$LINKS{admin_root_path}/defs/CategoryAlternates.def";


NOTE: I have added most of my tables into the backup process.

You will also have to add the DBSQL object variables to the use vars qw(....); line of code in the Load Modules section of the nph-build.cgi file.
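
For example, with the objects added above, that line might end up looking something like this (the exact list depends on what is already declared in your copy -- just append the new object names):

Code:

use vars qw($USE_HTML $LINKDB $CATDB $CATALT $USERDB);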

3) Then replace your sub build_backup routine with the following code:

Code:

sub build_backup {
# --------------------------------------------------------
# Backs up important database files.
#
    my $today = $LINKDB->get_date;
    my $time  = $LINKDB->get_time;

    print "\t=========================================\n";
    print "\tBacking up Database....\n";
    print "\t=========================================\n";

    if (-e "$LINKS{db_backup_path}/$today.tar.gz") {
        print "\t=========================================\n";
        print "\tCompressed Backup File exists for today.. Skipping\n";
        print "\t=========================================\n";
        return;
    }
    else {
        print "\tBacking up Links... \n";
        $LINKDB->export_data ( { file => "$LINKS{db_backup_path}/Links$today.db", header => 1 } );

        print "\tBacking up Category... \n";
        $CATDB->export_data ( { file => "$LINKS{db_backup_path}/Categories$today.db", header => 1 } );

        print "\tBacking up Alternative Category... \n";
        $CATALT->export_data ( { file => "$LINKS{db_backup_path}/AltCategories$today.db", header => 1 } );

        print "\tBacking up Users... \n";
        $USERDB->export_data ( { file => "$LINKS{db_backup_path}/Users$today.db", header => 1 } );
    }
    if (open (TAR, "$LINKS{tar_path} cf $LINKS{db_backup_path}/temp.tar $LINKS{db_backup_path}/* |")) {
        close TAR;
        open (GZ, "$LINKS{gzip_path} -c $LINKS{db_backup_path}/temp.tar > $LINKS{db_backup_path}/$today.tar.gz |") or die($!);
        print "\t=========================================\n";
        print "\tFile Compression Completed... \n";
        print "\t=========================================\n";
        close GZ;
        unlink("$LINKS{db_backup_path}/temp.tar") or die($!);
        system ("rm $LINKS{db_backup_path}/*.db");
    }

    print "\t=========================================\n";
    print "\tScript Completed on $today at $time\n";
    print "\t=========================================\n";
}


You will notice that I have added AltCategories and Users tables.

NOTE: As Alex stated, backing up the whole database, whether through mysqldump or through the above method, is time consuming, and you might want to consider creating a separate script for the backups...I have done this, and the script looks like the following:

Code:

#!/usr/local/bin/perl
# Backup WWWVL: Anthropology MySQL Database
# File Name: nph-backup.cgi
# Description: Backs up all pertinent tables in the MySQL database.
#=======================================================================

# Load required modules.
# ---------------------------------------------------
use lib '/mnt/web/guide/anthrotech/cgibin/vlib';
use CGI ();
use CGI::Carp qw/fatalsToBrowser/;
use Links::Links;
use Links::DBSQL;
use Links::DB_Utils;
use Links::HTML_Templates;
use strict;
use vars qw($USE_HTML $BANNERDB $LINKDB $CATDB $CATALT $EDITDB $USREVDB $USERDB);
$|++;

# Determine whether we should print HTML or text.
$USE_HTML = 0;
$ENV{'REQUEST_METHOD'} and ($USE_HTML = 1);

# Create the DBSQL objects we will use.
$BANNERDB = new Links::DBSQL "$LINKS{admin_root_path}/defs/Banners.def";
$LINKDB = new Links::DBSQL "$LINKS{admin_root_path}/defs/Links.def";
$CATDB = new Links::DBSQL "$LINKS{admin_root_path}/defs/Category.def";
$CATALT = new Links::DBSQL "$LINKS{admin_root_path}/defs/CategoryAlternates.def";
$EDITDB = new Links::DBSQL "$LINKS{admin_root_path}/defs/Editor_Reviews.def";
$USREVDB = new Links::DBSQL "$LINKS{admin_root_path}/defs/User_Reviews.def";
$USERDB = new Links::DBSQL "$LINKS{admin_root_path}/defs/Users.def";

# Determine what type of build we are doing and do it.
if ($USE_HTML) {
    CGI->nph(1);
    print CGI->header(), CGI->start_html ( -title => 'Backing up Database' ), CGI->h1('Backing up Database ... '), "<pre>";
}
my $s = time();
&main ();
$USE_HTML and print "</pre>";

sub main {
# --------------------------------------------------------
# Backs up important database files.
#
    my $today = $LINKDB->get_date;
    my $time  = $LINKDB->get_time;

    print "\t=========================================\n";
    print "\tBacking up Database....\n";
    print "\t=========================================\n";

    if (-e "$LINKS{db_backup_path}/$today.tar.gz") {
        print "\t=========================================\n";
        print "\tCompressed Backup File exists for today.. Skipping\n";
        print "\t=========================================\n";
        return;
    }
    else {
        print "\tBacking up Banner... \n";
        $BANNERDB->export_data ( { file => "$LINKS{db_backup_path}/Banners$today.db", header => 1 } );

        print "\tBacking up Links... \n";
        $LINKDB->export_data ( { file => "$LINKS{db_backup_path}/Links$today.db", header => 1 } );

        print "\tBacking up Category... \n";
        $CATDB->export_data ( { file => "$LINKS{db_backup_path}/Categories$today.db", header => 1 } );

        print "\tBacking up Alternative Category... \n";
        $CATALT->export_data ( { file => "$LINKS{db_backup_path}/AltCategories$today.db", header => 1 } );

        print "\tBacking up Editor Reviews... \n";
        $EDITDB->export_data ( { file => "$LINKS{db_backup_path}/Editor_Reviews$today.db", header => 1 } );

        print "\tBacking up User Reviews... \n";
        $USREVDB->export_data ( { file => "$LINKS{db_backup_path}/User_Reviews$today.db", header => 1 } );

        print "\tBacking up Users... \n";
        $USERDB->export_data ( { file => "$LINKS{db_backup_path}/Users$today.db", header => 1 } );
    }
    if (open (TAR, "/usr/local/bin/tar cf $LINKS{db_backup_path}/temp.tar $LINKS{db_backup_path}/* |")) {
        close TAR;
        open (GZ, "/usr/local/bin/gzip -c $LINKS{db_backup_path}/temp.tar > $LINKS{db_backup_path}/$today.tar.gz |") or die($!);
        print "\t=========================================\n";
        print "\tFile Compression Completed... \n";
        print "\t=========================================\n";
        close GZ;
        unlink("$LINKS{db_backup_path}/temp.tar") or die($!);
        system ("rm $LINKS{db_backup_path}/*.db");
    }

    print "\t=========================================\n";
    print "\tScript Completed on $today at $time\n";
    print "\t=========================================\n";
}


Notice that I have extra tables like User Reviews and Editor Reviews...you can simply replace these with other relevant tables.

4) Then you will have to create a crontab entry.

NOTE: There is code above that checks whether a backup already exists for today...the best method is to use the separate backup script and set the crontab to 23:00:00 so that you get the latest data of the day into the backup tar.gz file.
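
For example, a crontab entry along these lines (the path is a placeholder for wherever the script lives) runs the backup every night at 23:00:

Code:

0 23 * * * /path/to/admin/nph-backup.cgi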

Regards,

Eliot Lee

Re: backup tactic ? In reply to
Thanks pal, nice work !!!

Which one do you use, or do you use both?

Regards Startpoint.
Re: backup tactic ? In reply to
Hello,

If I look in a .db file, the newline [ENTER] is replaced by ~~.

When I restore this file, will it become an [ENTER] again? Is this the way MySQL stores newlines?

Thanks in advance.

Regards Startpoint.

Re: backup tactic ? In reply to
The problem is probably related to the earlier problems you were experiencing with the EXPORT/IMPORT functions of MySQL...the code is not much different from the standard backup process, which EXPORTS tables from the database into flat files.

You will probably have to tweak the files before IMPORTING them into your database for restoration purposes....But it will be less work than having to re-enter the data into the MySQL database.

Regards,

Eliot Lee

Re: backup tactic ? In reply to
You should use one of the methods provided above. I would recommend the stand-alone version, since you will have the most current backup, plus the other reasons I stated above. If you use the internal modification to nph-build.cgi, be prepared for the build process to slow down dramatically, and you risk the build timing out.

Regards,

Eliot Lee