This blog post shows how to get the effect of the Linux "cat" or "head" command on a file that has been compressed with gzip, using a mysqldump file as the example.
Create a MySQL dump and compress it with gzip:
mysqldump -u USERNAME -p -h MY_MASTER_SERVER \
  --all-databases --master-data --add-drop-table --add-drop-trigger \
  --single-transaction --events --triggers --routines \
  | gzip > dump.sql.gz
Restore a MySQL dump that was compressed:
gunzip < dump.sql.gz | mysql -u USERNAME -p -h MY_SLAVE_SERVER
View the contents of a compressed file (here, the first 25 lines):
gzip -cd dump.sql.gz | head -n25
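On most Linux systems, zcat (shipped with gzip) is shorthand for gzip -cd, so the following should be equivalent:

zcat dump.sql.gz | head -n25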
I used the above trick in a bash script to grab the master log file and log position from a compressed dump file:
# Pull the binary log file name (e.g. mysqld-bin.000123) from the dump header
local loc_active_master_log_file=$(gzip -cd dump.sql.gz | head -n25 | grep -oh '\w*mysqld-bin\.\w*')
# Pull the number that follows MASTER_LOG_POS= on the same header line
local loc_active_master_log_pos=$(gzip -cd dump.sql.gz | head -n25 | grep -oP '(?<=MASTER_LOG_POS=)[0-9]+')
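Those patterns work because a dump taken with --master-data contains a line like the following near the top (the file name and position here are made up for illustration):

CHANGE MASTER TO MASTER_LOG_FILE='mysqld-bin.000123', MASTER_LOG_POS=45678;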
Inside my bash script, after the import has finished, I then add a slave server to the cluster using those variables:
STOP SLAVE;
RESET SLAVE;
CHANGE MASTER TO
  MASTER_HOST='MY_MASTER_SERVER',
  MASTER_USER='${replication_user}',
  MASTER_PASSWORD='${replication_password}',
  MASTER_LOG_FILE='${loc_active_master_log_file}',
  MASTER_LOG_POS=${loc_active_master_log_pos};
START SLAVE;
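For completeness, here is a minimal sketch of how that SQL might be sent to the slave from the bash script; the heredoc wrapping is my assumption, and the connection details simply mirror the restore command above:

# Assumed wrapper: feed the replication setup statements to the slave via a heredoc
mysql -u USERNAME -p -h MY_SLAVE_SERVER <<EOF
STOP SLAVE;
RESET SLAVE;
CHANGE MASTER TO
  MASTER_HOST='MY_MASTER_SERVER',
  MASTER_USER='${replication_user}',
  MASTER_PASSWORD='${replication_password}',
  MASTER_LOG_FILE='${loc_active_master_log_file}',
  MASTER_LOG_POS=${loc_active_master_log_pos};
START SLAVE;
EOF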