Sync Live & Test Site
We have a live site and a test site built on C5. We want to sync the test site so we can train on the test site with the most current content from the live site. In phpMyAdmin we tried the sync function, but it didn't seem to work. Any suggestions?
When we run the C5 database backup we get the following error.
Fatal error: Maximum execution time of 30 seconds exceeded in /var/www/clients/client1/web1/web/www/updates/concrete5.5.2.1/concrete/libraries/3rdparty/adodb/drivers/adodb-mysql.inc.php on line 640
Export database from shell:
mysqldump -h mysql_host -u mysql_user -p mysql_database > export.sql
Import database from shell:
mysql -h mysql_host -u mysql_user -p mysql_database < export.sql
And if you want to sync on a schedule (e.g. every midnight), run those two commands as cron jobs (although in that case you'd also have to include the MySQL password in the two calls above).
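For example, a minimal crontab sketch, where live_database and test_database are hypothetical placeholder names just like the host and user above (putting the password on the command line is visible to other users on a shared server, so a ~/.my.cnf file readable only by your account is safer):

# m h dom mon dow  command
0 0 * * * mysqldump -h mysql_host -u mysql_user -pmysql_password live_database > /tmp/export.sql && mysql -h mysql_host -u mysql_user -pmysql_password test_database < /tmp/export.sql

Chaining the two with && means the import only runs if the dump succeeded.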
I'm joining late to an old thread. Hopefully I didn't miss something newer when I searched. Anyway, here goes.
Syncing databases is a pain, and it's one reason I've been experimenting (and failing) with NoDB CMS approaches such as Kirby.
The scenario I struggle with is the develop-deploy-revise-deploy cycle.
Once the site has gone live, there are at least two important types of data in the database: the content and the structure. Blocks store their configuration in the database and pages store their block structure there, for example.
So if revisions to the site are needed, how do you make changes to the structural information in the database without overwriting the new content? While the revisions are in development, it is likely that new content is being added to the live site. When the revised version is ready to be deployed live, the live database has the "right" content data, but the development database has the "right" structural data.
Bob
My general practice is in this howto:
http://www.concrete5.org/documentation/how-tos/developers/organise-...
It works for general development as well as core upgrades.
The difficulty comes when a live site is constantly changing (say an ecommerce site), so any snapshot/clone made for further development becomes out of date, leading to the risk of skew you have described. In that case, I treat any major enhancement as a two-stage process:
- Stage 1, take a clone (see the rough shell sketch below), do the work on the clone, work out the bugs. Make notes. Repeat until perfect.
- Stage 2, put the site in maintenance mode (to prevent customers creating skew in the data), take a clone, repeat the now well-proven work from my notes, test, switch the domain pointer.
An alternative is 2a, where rather than using maintenance mode, we accept there will be some data skew and patch it manually afterwards. 2a only works for a less-busy site and can't work if there are hundreds of orders coming in while upgrading.
An ideal solution would be some kind of transaction logging, where customer registrations and orders, or forum posts, could be captured and re-applied to the new site to bring the changes into line. Unfortunately that capability does not exist (I expect because it is not a simple thing to do reliably).
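For what it's worth, a rough shell sketch of the Stage 1 clone step, using placeholder paths and database names (live_database, clone_database, /var/www/live, /var/www/clone) rather than anything from this thread:

# Dump the live database and load it into a separate clone database
mysqldump -h mysql_host -u mysql_user -p live_database > clone.sql
mysql -h mysql_host -u mysql_user -p clone_database < clone.sql
# Copy the site files into the clone's web root, leaving out cache and temporary files
rsync -a --exclude 'files/cache' --exclude 'files/tmp' /var/www/live/ /var/www/clone/

After that, point the clone at clone_database (in concrete5 5.x the database constants normally live in config/site.php) and do the development work there.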
You can do the same for /files and /packages via your site host console by zipping them up and downloading and unzipping. (You may want to leave the cache and temporary files out of it.)
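As a rough shell sketch of that, assuming the usual concrete5 layout where the cache and temporary files live under files/cache and files/tmp:

# Archive /files and /packages, skipping cache and temporary files
tar -czf site-files.tar.gz --exclude='files/cache' --exclude='files/tmp' files packages

Download the archive, then unpack it on the other site with tar -xzf site-files.tar.gz.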
Be careful to always keep the live site as the master.