Shell script to export data from MySQL tables using the SELECT INTO OUTFILE method, with column names
I'd previously posted about exporting MySQL data into csv/psv/tab-delimited files by redirecting the output of SELECT with pattern replacements, and briefly mentioned the SELECT INTO OUTFILE method. Both work, but the SELECT INTO OUTFILE method is faster.
Also, loading data into tables using LOAD DATA INFILE is much, much faster than doing an INSERT on each row individually. Recently I was experimenting with different schema and partitioning options for a database with over 12 million rows, and with the LOAD DATA INFILE method I was able to load 12 million rows' worth of data in just 47 seconds.
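For reference, the LOAD DATA INFILE statement that re-imports such a dump mirrors the export's field and line terminators. Here's a minimal sketch (the path, database and table names are made up for illustration; IGNORE 1 LINES skips the header row that the script below prepends):

```shell
# Hypothetical names: adjust the path, database and table to your setup.
T="my_table"
load_sql="LOAD DATA INFILE '/path/to/dir/$T.psv'
INTO TABLE $T
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
IGNORE 1 LINES"

echo "$load_sql"
# To actually run it (needs the FILE privilege):
#   mysql MyDB -e "$load_sql"
```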
Anyway, here's a script I used to export every table from a database using the SELECT INTO OUTFILE method. This method provides no built-in way to export column names as the first line of the output, so the script inserts those as well.
#!/bin/bash
# Script to export all tables from a MySQL DB using the SELECT INTO
# OUTFILE method and to add table column names to dump files that this
# method otherwise doesn't allow. Requires superuser privileges to
# copy export files out of the MySQL data dir.
DB=MyDB # change me!
export_dir="/path/to/dir" # change me!
myopts="--skip-column-names -Be" # add username/password if not using ~/.my.cnf
err_exit()
{
	echo "ERROR: export step failed, aborting." 1>&2
	exit 1
}
mysql_data_dir=$(mysql $myopts "show variables like 'datadir'" | awk '{sub(/\/$/,"");print$NF}')
if ! mysqlshow "$DB" 1>/dev/null; then
	echo "ERROR: unable to access database" 1>&2
	exit 1
fi
if ! [ -w "$export_dir" ]; then
	echo "ERROR: export dir is not writable" 1>&2
	exit 1
fi
if ! sudo touch "$mysql_data_dir/$DB/test" 2>/dev/null; then
	echo "ERROR: this script will need sudo access to" 1>&2
	echo "move exported files out of the MySQL data dir." 1>&2
	echo "Come back when you get sudo, son." 1>&2
	exit 1
else
	sudo rm "$mysql_data_dir/$DB/test"
fi
ExportTable()
{
	mysql "$DB" $myopts "
	SELECT * INTO OUTFILE '$T.psv'
	FIELDS TERMINATED BY '|'
	LINES TERMINATED BY '\n'
	FROM $T
	"
}
AddColumnHeaders()
{
	cols=$(mysql "$DB" $myopts "desc $T" | awk '{print$1}' | tr '\n' '|' | sed 's/|$//')
	sed -i "1i $cols" "$export_dir/$T.psv"
}
mysql "$DB" $myopts "show tables" |
while read T; do
	echo "exporting $T"
	ExportTable || err_exit
	sudo mv "$mysql_data_dir/$DB/$T.psv" "$export_dir/" || err_exit
	AddColumnHeaders || err_exit
done
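The header-insertion step can be tried standalone. Here's a small sketch with made-up data and column names, using GNU sed's "1i" command the same way AddColumnHeaders does (in the script, $cols is built from "desc $T" output instead of being hard-coded):

```shell
# Hypothetical data file and column list, for illustration only.
f=$(mktemp)
printf '1|alice\n2|bob\n' > "$f"
cols="id|name"                 # in the script this comes from "desc $T"
sed -i "1i $cols" "$f"         # GNU sed: insert before line 1 in place
first=$(head -1 "$f")
echo "$first"                  # prints: id|name
rm -f "$f"
```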
1 Comment
1. Sergi replies at 9th July 2013, 5:18 am:
Hi Alain, I have a very big table with millions of records.
Do you think it would be possible to export only data between 2 dates?
I have an event_date field (timestamp). I would like to call the script in the following way:
myscript "2013-06-01 00:00" "2013-06-30 23:59" backup_2013_06.sql
myscript "2013-05-01 00:00" "2013-05-31 23:59" backup_2013_05.sql
Do you think that would be a good backup system? I don't know if the OUTFILE command supports a SELECT first.
Thanks