You can import a dump on the SQM Monitor screen. You can give it a local or remote file name.
I _strongly_ suggest, when dumping the database, dumping the table structure only (no data) to one file, then the data to another file. This makes it easy to rebuild the database structure as well as the data.
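For what it's worth, mysqldump has switches for exactly that split. A minimal sketch; the database name `links_sql` and the `admin` user here are placeholders for your own:

```shell
# Structure only (CREATE TABLE statements, no rows) to one file:
mysqldump --no-data -u admin -p links_sql > structure.sql

# Data only (INSERT statements, no CREATE TABLE) to another:
mysqldump --no-create-info -u admin -p links_sql > data.sql
```

Restoring is then a matter of feeding `structure.sql` in first, followed by `data.sql`.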
If you have a lot of links, I would dump the Links table separately, into its own file, with the "Create table" statements included. Then dump the rest of the tables into their own file.
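mysqldump accepts table names after the database name, so the per-table split looks like this. Again a sketch with placeholder names (`links_sql` for the database, `Links`/`Categories`/`Users` for the tables):

```shell
# The Links table into its own file, CREATE TABLE included (the default):
mysqldump -u admin -p links_sql Links > links.sql

# The remaining tables into another file:
mysqldump -u admin -p links_sql Categories Users > rest.sql
```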
The _only_ tables you really need are Links, Categories, and Users, as long as you have done a recent "build" of the site. All the other tables hold either visitor tracking and updates or data generated by the re-index and build.
The problem with large databases is that the HTML import process will time out on large imports (dumps are much, much faster). You may have to do the import via telnet.
To do it via telnet, you need either to write out the insert statements with each field named, or to have a database set up _exactly_ like the source, with _all_ the fields of _all_ the tables in the same order. Then you can use the 'mysql' command directly and feed the file into the program using the '<' character.
The "dumpfile" is really just a lot of SQL commands, one after another, which is why it's portable from machine to machine.
The mysql database engine views the file as a user typing in commands very, very fast :) <G>
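To make that concrete, here is a tiny hand-made "dump file" (the table and column names are invented for illustration) and the way you would feed it to the client:

```shell
# A dump file is nothing but SQL statements in order:
cat > mini_dump.sql <<'EOF'
CREATE TABLE Links (ID INT, URL VARCHAR(255));
INSERT INTO Links (ID, URL) VALUES (1, 'http://example.com/');
EOF

# The '<' feeds the file to mysql, which replays the statements
# exactly as if a user had typed them in (very quickly):
#   mysql -u admin -p links_sql < mini_dump.sql
```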
PUGDOG®
PUGDOG® Enterprises, Inc.
FAQ: http://postcards.com/FAQ