JavaChap Blog: Java and Technology musings for the masses

15 Oct 2012

Script to Backup MySQL and CVS to Amazon S3

Here is a simple script I wrote to back up MySQL and the CVS root folder to Amazon S3. For the script to work, you need to have s3sync in your path.

#!/bin/bash

# Database Configurations
BUCKET=backup.javachap.com
DB_USER=backup
DB_PWD=readonly
NOW=$(date +_%m_%d_%y)

S3_BACKUP=/opt/s3backup
CVSROOT=/usr/local/cvsroot
CVS_BACKUP_NAME=jc_cvs_backup$NOW.tar.gz
MYSQL_BACKUP_NAME=jc_mysql_backup$NOW.sql.gz

cd $S3_BACKUP

# tar the cvsroot folder
tar czf $CVS_BACKUP_NAME $CVSROOT

# list MySQL databases and dump each
DB_LIST=`mysql -u $DB_USER -p"$DB_PWD" -e'show databases;'`
# strip the 'Database' column header from the output
DB_LIST=${DB_LIST##Database}
for DB in $DB_LIST;
do
  FILENAME=mysql_${DB}${NOW}.sql.gz
  mysqldump -u $DB_USER -p"$DB_PWD" --opt --flush-logs $DB | gzip > $FILENAME
done

# bundle the individual database dumps into a single archive and remove them
tar czvf $MYSQL_BACKUP_NAME mysql*
rm -f mysql*

# sync the backup folder to the S3 bucket, skipping the archives folder and this script
cd /opt/s3sync
ruby s3sync.rb -r -v -s --exclude="archieves$|s3backup.sh" /opt/s3backup/ $BUCKET:

# move the uploaded archives aside so they are not synced again on the next run
mv /opt/s3backup/*.gz /opt/s3backup/archieves/
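
To run the backup nightly, the script can be scheduled with cron. A minimal sketch, assuming the script is saved as /opt/s3backup/s3backup.sh (the name the --exclude pattern above expects) and that s3sync can find your AWS credentials:

# /etc/crontab entry: run the backup every night at 2 AM and keep a log
0 2 * * * root /bin/bash /opt/s3backup/s3backup.sh >> /var/log/s3backup.log 2>&1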

1 Sep 2011

The 5 Biggest Ways to Boost MySQL Scalability

Sean Hull, with some really good MySQL advice:

  1. Tune those queries. Biggest bang for your buck. Enable the slow query log and watch it. Once you've found a heavy, resource-intensive query, optimize it! Tune what receives real-world traffic (a my.cnf sketch covering points 1, 3 and 5 follows this list).
  2. Employ Master-Master Replication. You immediately have a read-only slave for your application to hit as well.
  3. Use Your Memory. Set innodb_buffer_pool_size, key_buffer_size and the other key options.
  4. RAID Your Disk I/O. Use RAID 10 (mirroring and striping). On EC2, stripe across a number of EBS volumes using Linux md software RAID.
  5. Tune Key Parameters. Speed up inserts & updates with innodb_flush_log_at_trx_commit=2; create a tablespace and underlying data file for each table with innodb_file_per_table.
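
As a rough illustration of points 1, 3 and 5, here is a my.cnf sketch; the values are placeholders and need to be sized to your own workload and RAM:

[mysqld]
# 1. Log queries slower than 2 seconds so they can be found and tuned
slow_query_log                 = 1
slow_query_log_file            = /var/log/mysql/slow.log
long_query_time                = 2
# 3. Give InnoDB (and the MyISAM key cache) most of the memory on a dedicated database box
innodb_buffer_pool_size        = 4G
key_buffer_size                = 256M
# 5. Trade a little durability for faster commits, and keep one tablespace file per table
innodb_flush_log_at_trx_commit = 2
innodb_file_per_table          = 1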

Full details are in the original article.

15 Nov 2010

MySQL Import and Export

Here are a few MySQL queries/commands that I regularly use for exporting and importing MySQL data. This is mainly for my own reference, but I thought it might be useful for newbies as well.

MySQL Table Data Export and Import
  • To export just the table data, please use the following query:
    SELECT * INTO OUTFILE 'data.txt' FIELDS TERMINATED BY ',' FROM mytable;
    
  • To import the CSV file into a table, execute the following SQL query:
    LOAD DATA INFILE 'data.txt' INTO TABLE mytable FIELDS TERMINATED BY ',';
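  • If the data itself contains commas, it is safer to quote the fields as well. A sketch (the /tmp/data.csv path and mytable are placeholders, and the MySQL server must be allowed to write to that path):
    SELECT * INTO OUTFILE '/tmp/data.csv'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n'
    FROM mytable;
    LOAD DATA INFILE '/tmp/data.csv' INTO TABLE mytable
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n';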
    
Copy the contents of one database to another
  • First, create the new database:
    create database db2;
    
  • Then, from the command line, run the following to copy the contents of db1 to db2:
    $ mysqldump -u root --password=pass db1 | mysql -u root --password=pass db2
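  • The same pipe also works for a single table; a sketch, with mytable as a placeholder table name:
    $ mysqldump -u root --password=pass db1 mytable | mysql -u root --password=pass db2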
    
MySQL Dump
  • To export a MySQL database into a dump file, please execute the following command:
    $ mysqldump -u root --password=pass db1 > dbdump.sql
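  • To restore that dump later, feed it back through the mysql client (the target database must already exist):
    $ mysql -u root --password=pass db1 < dbdump.sql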
    
Create MySQL User
  • Creating a MySQL user and granting it access:
    CREATE USER 'myuser'@'localhost' IDENTIFIED BY 'myuser';
    GRANT ALL ON *.* TO 'myuser'@'localhost' IDENTIFIED BY 'myuser';
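  • Granting ALL on *.* gives the account full access to every database; for an application account it is usually safer to limit the grant to the one database it needs (db1 below is just a placeholder):
    GRANT ALL PRIVILEGES ON db1.* TO 'myuser'@'localhost';
    -- or, for a read-only account:
    GRANT SELECT ON db1.* TO 'myuser'@'localhost';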
    
27 Dec 2008

CRUD Application: Struts 1, Hibernate and MySQL