On my local computer I have a test environment running XAMPP. I am working on a PHP / MySQL application.
Now I often make changes to the local website, but when I want to upload I have to do a lot of work. This is what I do now.
I export my local DB through phpMyAdmin.
I clear my remote DB and import the exported dump.
Sometimes the DB is fairly large, so this takes a lot of time. Is there a script I can use to speed up this process?
I have no remote access to the production DB
When I make changes to the local DB the website is turned off, so it's just a mirror when done.
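For reference, the manual workflow above boils down to an export followed by an import. One easy win for a large DB is compressing the dump before uploading it, since phpMyAdmin can import .sql.gz files directly. A sketch (the mysqldump line is commented out because it needs your real credentials; the demo below just shows the compression step on a synthetic dump):

```shell
# Hypothetical names; this mirrors what phpMyAdmin's Export tab produces.
# mysqldump -u root -p mylocaldb > site.sql
# Compress before uploading -- phpMyAdmin accepts .sql.gz imports:
# Demo of the compression step on a synthetic dump file:
for i in $(seq 500); do echo 'INSERT INTO t VALUES (1);'; done > site.sql
gzip -f site.sql   # produces site.sql.gz and removes site.sql
ls site.sql.gz
```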
You could use Phing's DBDeploy Task to manage your database migrations. This has the advantage that any changes to your database become an integral part of your versioning system.
Here is an article describing how to do it:
Alternatives are Rob Allen's Akrabat_Schema_Manager and I think Doctrine has one too.
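With dbdeploy-style migrations, each schema change lives in its own numbered delta file: the forward statements first, then a `-- //@UNDO` marker, then the statements that revert the change. A sketch of what such a file might look like (file name and table are made up):

```shell
# Create a hypothetical dbdeploy delta file.
mkdir -p deltas
cat > deltas/1-create_users_table.sql <<'SQL'
CREATE TABLE users (
    id INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100) NOT NULL
);
-- //@UNDO
DROP TABLE users;
SQL
cat deltas/1-create_users_table.sql
```

Because every delta is a plain file, the whole schema history can live in version control alongside the application code.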
Here is a backup and migration script that I use:
#!/bin/bash
# Back up the test DB, mirror it to production, and email the archive.
email="email@addresswherebackupissent"
dbhost="dbhost"
testdbname="testdbname"
productiondbname="productiondbname"
dbuser="dbuser"
dbpass="userpass"

mkdir -p ~/archives
cd ~/archives || exit 1
suffix=$(date +%y%m%d)

# Body of the notification email
echo "SQL Backup of \"$testdbname\" attached" > tmp.txt

# Dump the test database
mysqldump --opt -u "$dbuser" -p"$dbpass" -h "$dbhost" "$testdbname" > SQL_"$testdbname"_"$suffix".sql

# Clear the production database, then load the fresh dump into it
mysql -u "$dbuser" -p"$dbpass" -h "$dbhost" "$productiondbname" < ../backup/drop/drop.sql
mysql -u "$dbuser" -p"$dbpass" -h "$dbhost" "$productiondbname" < SQL_"$testdbname"_"$suffix".sql

# Archive the dump and mail it (newer mutt versions need "--" before the address)
tar zcf SQL_Backup_"$testdbname"_"$suffix".tar.gz *.sql
mutt -s "SQL Backup of \"$testdbname\"" -a SQL_Backup_"$testdbname"_"$suffix".tar.gz -- "$email" < tmp.txt

# Clean up
rm -f *.sql tmp.txt
This script can be run from the shell; it backs up your test DB, loads it into the production DB, and emails the backup to you. The file drop.sql is a copy of the drop commands phpMyAdmin uses to clear the production site's database.
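If you would rather not maintain drop.sql by hand, you could generate it from the live table list. A sketch (the mysql line is commented out since it needs your credentials; the demo uses a fixed table list standing in for SHOW TABLES output):

```shell
# Hypothetical: list production tables and emit a DROP for each.
# mysql -u "$dbuser" -p"$dbpass" -h "$dbhost" -N -e 'SHOW TABLES' "$productiondbname" \
#   | awk '{print "DROP TABLE IF EXISTS `" $1 "`;"}' > drop.sql
# Demo with a fixed table list standing in for SHOW TABLES output:
printf '%s\n' users orders sessions \
  | awk '{print "DROP TABLE IF EXISTS `" $1 "`;"}' > drop.sql
cat drop.sql
```

Regenerating drop.sql each run also keeps it in sync when tables are added or renamed.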