MySQL has a batch mode that takes any SQL command from a file:
$ mysql < batchfile
batchfile contains the SQL commands, e.g. one per line. This can save some interaction.
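A minimal sketch (the statements and the file name here are just placeholders):

```shell
# put two example SQL statements into a batchfile, one per line
printf '%s\n' 'SHOW DATABASES;' 'SELECT NOW();' > batchfile

# run them non-interactively (requires a reachable MySQL server):
#   mysql < batchfile
```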
I just wanted to upgrade some web software. But I didn’t want to just update the live site directly; I wanted to be a bit more clever and test upfront whether the migration would turn out flawlessly. So basically: copy over the files and the tables in the MySQL database, change the db table prefix configuration in the copied directory, and run a test update on that copy.
Probably not high computer science, but better than crashing a live site. There is always something you don’t expect, like an incompatible plugin (especially with wordpress, not to mention themes), or it’s just some software you don’t know well and you don’t want to screw it up. Better safe than sorry; backup first.
In my case (if of interest), it’s the MyBB forum software, a nice tool by the way.
But anyway, why not use the mysql command itself to generate a bunch of SQL queries and then run them in batch mode? Repeat = batch.
Here we go. If you do not know the mysql command’s basic connection settings (host, username, password, database), look up these switches first, as you may need them in each mysql call; otherwise the connection to the database will fail.
MySQL connect cheatline:
mysql --host=hostname -u username --password=password dbname
The next thing that is good to know: instead of using
SHOW TABLES; you can query the
INFORMATION_SCHEMA database in MySQL to gather information about the structure of the whole database, comparable to reflection in PHP. The MySQL manual has the details, as usual.
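As a sketch (DATABASENAME is a placeholder), the INFORMATION_SCHEMA equivalent of SHOW TABLES is an ordinary SELECT, which is exactly why it is useful here: its output can later be wrapped in functions like CONCAT().

```shell
# list all tables of one schema; a regular query, not a SHOW command
QUERY="SELECT table_name FROM INFORMATION_SCHEMA.TABLES WHERE table_schema='DATABASENAME';"
echo "$QUERY"

# run it against the server without the column-name header line:
#   mysql --skip-column-names -e "$QUERY"
```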
So the tools are in place for some quick’n’dirty data migration. Don’t try this with really large databases (especially with lots of data in MySQL), because you might run into problems. That is usually not the case for regular, smaller websites, so I don’t cover it here. Let’s just assume stuff works as documented and immediately (that’s the important adverb). I mean, just consider that copying the tables could bring your live site down because it simply takes too long. Consider yourself warned.
What follows is what I did. It’s a pretty new forum with some activity, but not that much (if you have real traffic on a database-intensive site, consider PostgreSQL, but this post is about MySQL). The default table prefix in the MyBB forum software is (surprise, surprise)
mybb_. This works for anything with its own table prefix as well, e.g. wordpress. This is basically what I did:
Copy (or rename) multiple tables in a MySQL Database
$ mysql --skip-column-names -e "SELECT CONCAT('CREATE TABLE IF NOT EXISTS copy', table_name, ' LIKE ', table_name, '; INSERT INTO copy', table_name, ' SELECT * FROM ', table_name, ';') FROM INFORMATION_SCHEMA.TABLES WHERE table_schema='DATABASENAME';" > copyqueries
$ mysql < copyqueries
This creates the file
copyqueries with one result row per line of the query specified by
-e. Take note of DATABASENAME, which needs to be configured. The
--skip-column-names switch makes sure the column name is not put in the first line. The result of this query is the batch file holding all the SQL queries needed to copy every table of the database.
Execute it and voila: list your tables and you’ll see everything copied over.
And as I’m not a linux commandline wizard (yet), I’m pretty sure that this can be optimized even more, but I was confident enough to give it a run this way for the moment. Feel free to leave some comments if you know better (like piping the output directly into mysql batch mode; I’m a bit conservative on this one but eager to learn more).
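For the record, that direct-pipe variant would look roughly like this, a sketch I have not battle-tested; note that it leaves no copyqueries file behind to inspect if something goes wrong:

```shell
# the same CONCAT query as above, kept in a variable for readability
# (DATABASENAME is a placeholder)
SQL="SELECT CONCAT('CREATE TABLE IF NOT EXISTS copy', table_name, ' LIKE ', table_name, '; INSERT INTO copy', table_name, ' SELECT * FROM ', table_name, ';') FROM INFORMATION_SCHEMA.TABLES WHERE table_schema='DATABASENAME';"
echo "$SQL"

# generator piped straight into a second mysql in batch mode,
# skipping the intermediate file:
#   mysql --skip-column-names -e "$SQL" | mysql
```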
To further specify the tables that need to be copied, the WHERE-clause can be extended. For example, to only copy tables with a certain prefix in their name:
WHERE table_schema = 'DATABASENAME' AND table_name LIKE ('mybb_%')
Bonus: If you want to get all lines but the first one from a file,
tail -n+2 file does the job. I did not need this in the end because of the
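A quick illustration with a throwaway file (the file name and contents are arbitrary):

```shell
# build a three-line sample file
printf 'header\nrow1\nrow2\n' > sample.txt

# print everything except the first line
tail -n +2 sample.txt
# output:
# row1
# row2
```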