Managing The Database After Deployment

I use the typical development workflow of working on a local server, deploying to a staging server for additional testing, and then deploying to the live server. Dealing with the database is not much trouble at the local and staging stages, but once the site is deployed live, I struggle.

When the live site needs some form of revision, that often includes changes to the database content, or perhaps even structure.

What techniques does the concrete5 developer community use to keep the live database and the development database in sync?

Bob

bobrocke
JohntheFish replied:
For major changes, I clone the live site back to a development site and test the changes there; having learned what not to do, I can then make the changes to the live site.

For core upgrades, I generally work on a clone and swap the domain pointer when complete, as per:
http://www.concrete5.org/documentation/how-tos/developers/organise-...

When I have plenty of time to spare, I am going to look at using MySQL to sync databases, made possible by:
http://www.concrete5.org/documentation/how-tos/developers/install-c...

What I would really like to find time to do is create a page export/import utility that will package up everything needed to re-create a page on another site, either as an addon, or as a core pull request.
bobrocke replied:
I would buy a utility like this.

Bob
jordanlev replied:
This is indeed a problem (for all CMSs, or any system where content or configuration is stored in a database).

The solution my co-workers and I use is to treat the live site's database as the "one source of truth". When we need to work on it locally, we check out the code from our git repository (or, if you aren't using version control, you'd just edit the site files already on your local computer, assuming you didn't change anything on the server). Then we go to the live site and create a database backup via the dashboard. Then we ZIP up the entire /files/ directory (which includes the database backup -- although you'll need to change permissions on that file before ZIP'ing, otherwise it won't get included). Then we download the ZIP file, re-create the database on our local machine, and replace the /files/ directory with what we just downloaded from the live site.
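
For anyone who wants to script the local half of this, here is a minimal Python sketch of it -- the paths, the database name, and the passwordless mysql CLI call are all assumptions you'd adapt to your own setup:

```python
#!/usr/bin/env python3
"""Unpack the /files/ archive downloaded from the live site and load the
dashboard's database backup into a local MySQL database. Paths, the
database name, and the mysql CLI invocation are all assumptions."""

import subprocess
import zipfile
from pathlib import Path

ARCHIVE = Path("files.zip")          # ZIP downloaded from the live site
SITE_ROOT = Path("/var/www/mysite")  # local concrete5 install (assumed)
LOCAL_DB = "mysite_dev"              # local database name (assumed)

# 1. Overlay the live site's /files/ copy onto the local install.
with zipfile.ZipFile(ARCHIVE) as z:
    z.extractall(SITE_ROOT)

# 2. Find the newest .sql backup the dashboard placed under /files/.
backups = sorted((SITE_ROOT / "files").rglob("*.sql"),
                 key=lambda p: p.stat().st_mtime)
if not backups:
    raise SystemExit("no .sql backup found under files/")
dump = backups[-1]

# 3. Re-create the local database and load the dump into it.
subprocess.run(
    ["mysql", "-u", "root",
     "-e", f"DROP DATABASE IF EXISTS {LOCAL_DB}; CREATE DATABASE {LOCAL_DB}"],
    check=True)
with dump.open("rb") as fh:
    subprocess.run(["mysql", "-u", "root", LOCAL_DB], stdin=fh, check=True)

print(f"restored {dump.name} into {LOCAL_DB}")
```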

It is a pain in the neck, although we have developed internal tools that do this for us in one or two clicks. If you don't have the capabilities or time to automate this stuff yourself, I'd highly recommend the most excellent "Backup Voodoo" addon (made by JohntheFish -- I think he was being modest by not suggesting it himself).

If you come up with your own method, please share -- I'm always interested in hearing about ways to streamline the process further.

(But the bottom line is: you are not alone, it's a problem for everyone and there is no magic bullet solution).
bobrocke replied:
Thanks to both JohntheFish and jordanlev for their detailed and thoughtful answers. I suppose the situation is as I feared: once the site is deployed, it's very difficult to revise it without risking unsynchronized data or taking the live site into maintenance mode for an extended period.

The WordPress community has WP Migrate DB Pro (http://deliciousbrains.com/wp-migrate-db-pro/), which works pretty well for keeping MySQL databases in sync. I've contacted them about a more general-purpose application for other uses of MySQL, but they are not considering that now.

Another option, though one that would require far too much change for current concrete5 deployments, would be to split concrete5's database use across two databases -- one for content and one for configuration. But I doubt the developers would even consider that.

So it looks like a careful, manual process is the answer at this point.

Bob
mkly replied:
Have you taken a look at this?

https://github.com/danklassen/consh...

It has some nice stuff for pulling down DBs, etc., when you need to update your local copy.

Best Wishes,
Mike
bobrocke replied:
There's some interesting stuff there. I need to take a closer look, but many of the servers I deal with don't allow SSH access. :(

Bob
bobrocke replied:
The MySQL Workbench app claims to do database syncing between two hosts: http://dev.mysql.com/doc/workbench/en/wb-database-synchronization.h...

Has anyone experimented with that and have an opinion on it?

If not, I'll try to set something up and see what it does for a simple case.

Bob
jordanlev replied:
I encourage experimentation with this, and would love to hear back about the results.

But the curmudgeonly side of me thinks that syncing databases is too "dangerous", and for the sake of safety I prefer to treat one database as the ultimate source of truth and always do full copies of it from one site to another (fully overwriting the "other" in the process).

I believe this because the C5 system schema is so complex -- for example, a single edit to a page (say, updating a content block) requires changes to btContentLocal, CollectionVersions, and CollectionVersionBlocks (and maybe more -- that's just off the top of my head). There may also be other records relying on a certain block ID, which has now changed because you updated the block. And what if the same block was edited in a different way on another site -- how can you realistically untangle the various changes across many different tables?
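
As a read-only illustration (using the third-party PyMySQL driver -- and the table/column names here are my recollection of the 5.6-era schema, so treat them as assumptions), this is the kind of fan-out a row-level sync would have to untangle for just one block:

```python
#!/usr/bin/env python3
"""Read-only look at the fan-out from a single content block edit.
Table/column names follow the 5.6-era schema and are assumptions;
connection details are placeholders. Requires: pip install pymysql."""

import pymysql

conn = pymysql.connect(host="localhost", user="root",
                       password="", database="mysite_dev")
BID = 42  # a hypothetical content block ID

with conn.cursor() as cur:
    # The block's own data lives in its block-type table...
    cur.execute("SELECT bID, content FROM btContentLocal WHERE bID = %s",
                (BID,))
    print("btContentLocal rows:", cur.fetchall())

    # ...while the pages and versions that display it are tracked
    # elsewhere, so a row-level sync must keep all of these consistent.
    cur.execute(
        """SELECT cvb.cID, cvb.cvID, cv.cvName
             FROM CollectionVersionBlocks cvb
             JOIN CollectionVersions cv
               ON cv.cID = cvb.cID AND cv.cvID = cvb.cvID
            WHERE cvb.bID = %s""", (BID,))
    print("version rows referencing the block:", cur.fetchall())

conn.close()
```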

I'd love to be wrong about this though -- if there were a "safe" and easy way to sync two databases, it would be amazingly helpful!

-Jordan
JohntheFish replied:
I had a bit of a think about an individual page-by-page export/import with all the associated odds and ends.

Basically, it would export a list of instructions to re-create or replace a page, adding blocks and attributes along with any associated images from the file manager.

It may be feasible for pages with simple blocks, but as soon as a block has a secondary data table, export could only be achieved with the block controller's cooperation in providing its own export/import assistance. There are many other complications involved, but secondary data is the really big one.
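
For the simple-block case, the export half might start out as little more than a manifest dump. A rough Python sketch against the 5.6-era tables (the schema details are assumptions, and it deliberately ignores secondary data) could look like:

```python
#!/usr/bin/env python3
"""Rough sketch of the export half for pages whose blocks have no
secondary data tables: write one page's block list to a JSON manifest.
Schema details are assumptions from the 5.6-era tables; requires the
third-party PyMySQL driver."""

import json
import pymysql

conn = pymysql.connect(host="localhost", user="root",
                       password="", database="mysite_dev")
CID = 123  # hypothetical page (collection) ID

with conn.cursor(pymysql.cursors.DictCursor) as cur:
    # Which blocks the page's versions place, and in which areas.
    cur.execute(
        """SELECT cvb.cvID, cvb.bID, cvb.arHandle, b.btID
             FROM CollectionVersionBlocks cvb
             JOIN Blocks b ON b.bID = cvb.bID
            WHERE cvb.cID = %s""", (CID,))
    manifest = {"cID": CID, "blocks": cur.fetchall()}

with open(f"page-{CID}.json", "w") as fh:
    json.dump(manifest, fh, indent=2, default=str)
conn.close()
```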

Maybe we can get Andrew to give some thoughts on the next Totally Random.
bobrocke replied:
Well, it appears that MySQL Workbench will only synchronize the database structure, not the data. :(
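
For copying structure and data together, one fallback in the spirit of jordanlev's full-overwrite approach is to pipe a complete mysqldump from one server into the other. A minimal Python sketch, assuming placeholder hosts and credentials supplied via ~/.my.cnf:

```python
#!/usr/bin/env python3
"""One-way structure-AND-data copy via mysqldump -- a full overwrite of
the target rather than a true two-way sync. Hosts, users, and database
names are placeholders; passwords are assumed to come from ~/.my.cnf
so nothing prompts mid-pipe."""

import subprocess

SRC = {"host": "dev.example.com", "user": "deploy", "db": "mysite"}
DST = {"host": "staging.example.com", "user": "deploy", "db": "mysite"}

# Dump the complete source database (schema and rows)...
dump = subprocess.Popen(
    ["mysqldump", "-h", SRC["host"], "-u", SRC["user"], SRC["db"]],
    stdout=subprocess.PIPE)

# ...and stream it straight into the target, overwriting what is there
# (mysqldump emits DROP TABLE IF EXISTS statements by default).
subprocess.run(
    ["mysql", "-h", DST["host"], "-u", DST["user"], DST["db"]],
    stdin=dump.stdout, check=True)
dump.stdout.close()
if dump.wait() != 0:
    raise SystemExit("mysqldump failed")
```

That's a one-way overwrite rather than a true sync, though, which is why I'm still looking for a dedicated tool.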

There are some programs for Windows that claim to synchronize data, but I want a Mac solution. Navicat for MySQL says it will do it and runs on Mac: http://www.navicat.com/products/navicat-for-mysql...

Has anyone used that? Does it work?

I'll download the free trial and see what I think.

Bob
bobrocke replied:
I'm using the free 30-day trial of Navicat for MySQL (it's $129) and have done a few limited experiments syncing the database between my development server and my staging server. The good news: so far, so good.

There needs to be more testing, and I'm definitely not ready to turn it loose on the production server, but the initial signs are good.

When I come to a conclusion, I'll post here.

Bob
bobrocke replied:
As a longer-term follow-up: Navicat looks like the solution I'll go with. Although I still haven't tested it in a wide enough range of circumstances to make a blanket recommendation to others, it has worked fine for me (as long as I pay attention to which database is the source and which is the target).