Backup Addon?

Permalink
Is there any useful addon for creating backups? I know that c5 has a backup & restore feature in the Dashboard, but I'm looking for something more comprehensive, maybe something that backs up more data (not only the SQL part).

Does anything exist for this? The only result for a "backup" search in the addons is yet another free database-only backup.

 
12345j replied on at Permalink Reply
12345j
You really only need to back up the db. Otherwise, just download all the folders from your site and upload them to another one. The SQL data won't be included in that, though; that's why you need the SQL backup addons.
JohntheFish replied on at Permalink Reply
JohntheFish
You may find this howto of interest:

http://www.concrete5.org/documentation/how-tos/developers/backup-a-...

It covers database backups and has some notes about backups for files and the overall installation. However, backing up the overall installation really depends on the facilities your web host provides.

I think you have a good point about there being a need for an addon that handles other backups. An addon that intelligently backs up the /files directory could be of use to many sites with frequently changing /files content.
Lintu replied on at Permalink Reply
Thank you for the reply and the link. We don't have any limits, as we have our own data center and host everything ourselves. I was just hoping there would be an "intelligent" addon that could back up everything and zip it to a safe place.

Thank you for your help, though :)
mkly replied on at Permalink Reply
mkly
One of the problems is that a lot of people on shared hosts are going to "timeout" before the backup completes. So that is a big barrier to it getting developed.
Lintu replied on at Permalink Reply
Couldn't you make a PHP script that auto-refreshes itself every 10 seconds and does the backup in the background? That way you should be able to get around the php.ini timeout, no?
mkly replied on at Permalink Reply
mkly
Go for it.
Lintu replied on at Permalink Reply
I'm not sure how to make proper addons for concrete5, but I don't think it should be too difficult to create a standalone script to back up databases with. I'll give it a go.
mkly replied on at Permalink Reply
mkly
That's awesome. I'll be more than happy to answer any questions you have. My github username is mkly. And you can always pm me here. If you can figure out a script that gets around the timeout issue I would be more than happy to wrap it up in a package and give you all the glory.
Lintu replied on at Permalink Reply
I'll let you know when I've got a stand-alone script working.

Would be great if you could wrap it into an addon then :) Not sure what GitHub is, though.
Lintu replied on at Permalink Reply
I just wanted to start making a script and found this: http://de.php.net/manual/de/function.set-time-limit.php...

Wouldn't this be enough? Just extend the timeout for the backup script, only one line. Or is there still a problem with this?
mkly replied on at Permalink Reply
mkly
Try it out and let me know.

You should skip the /files/cache directory, as that isn't needed. You can attach the PHP file (rename it to .txt) to your post and I'll test it out on some shared servers I have access to.
Lintu replied on at Permalink Reply
I tested it and it works! I edited a core file of c5 though, so it's not a good solution. As said, I don't know how to make addons and stuff.

This is what I did:

I took the concrete/libraries/backup.php and added only one line at the beginning: "set_time_limit(600);"

This sets the limit to 10 minutes from the point of the call, overriding the php.ini "max_execution_time" setting.

I went to the dashboard and made a backup: no more timeouts, and the correct SQL file appears where it should. Everything is perfect!


Now I have to revert the core file, though, because I'm not allowed to edit core files of the CMS in the live environment of the project I manage. Could you help me find a good place to put this line so that only the dashboard backup page executes it, without editing core files? Can you make addons that just add one line to an existing Concrete5 script?
mkly replied on at Permalink Reply
mkly
Were you getting timeouts before? A lot of the time, shared hosts kill processes because they use too much memory. This does sound like it could be a good idea, though.

Oh, I thought you also wanted to back up the files? Either way, we can break this out into an addon pretty easily and just install it.
mkly replied on at Permalink Reply
mkly
I was just thinking: if your only problem was the timeout, why don't you just set max_execution_time to something larger in your php.ini file?
Lintu replied on at Permalink Reply
It's not the only problem. The basic "problem" is that we need a better backup addon that makes intelligent, high-quality backups with one click.

The timeout is only a minor issue within this.
JohntheFish replied on at Permalink Reply
JohntheFish
There are php configured timeouts and there are web host configured apache timeouts. On a dedicated host you have control of both. On a shared host you may be able to exercise some control of the php timeout, but the basic apache timeout of the hosting package will sooner or later become the limiting factor and the web request will be stopped with a 500 error.

As you hinted earlier, the usual way round it is to break a long process down into chunks that each perform a little slice of the process, return to the browser and then run the next slice, either by depending on user traffic or by using a script timeout and refresh.

The disadvantage of this for backups is that it is no longer a perfect snapshot of the system. Because a complete backup is spread over a number of web page requests, other interleaving requests may have added or deleted files in the interim. This isn't an insurmountable problem, but it does complicate matters. Or perhaps the minor inconsistencies that such an approach introduces are acceptable in the grand scheme of things.

For your modified file, just copy it from the /concrete folder to the equivalent root folder. See http://www.concrete5.org/documentation/how-tos/developers/change-th...
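To make the override step concrete, here is a scratch-directory sketch of that copy, assuming a typical concrete5 5.x layout; the paths are real conventions, but the core file here is a one-line stand-in created just for the demo:

```shell
#!/bin/sh
set -e
# Demo of concrete5's root-override mechanism in a scratch directory.
# In a real site you would run the cp from your web root; this "core file"
# is a stand-in, not the real backup.php.
ROOT=$(mktemp -d)
mkdir -p "$ROOT/concrete/libraries" "$ROOT/libraries"
echo '<?php /* core backup.php stand-in */' > "$ROOT/concrete/libraries/backup.php"

# The actual step: copy the core library up to the root libraries/ folder,
# then make your edit (e.g. add set_time_limit(600)) in the copy only.
cp "$ROOT/concrete/libraries/backup.php" "$ROOT/libraries/backup.php"
ls "$ROOT/libraries"
```

concrete5 checks the root folders before /concrete, so the edited copy is the one that gets loaded and the core tree stays untouched.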
Lintu replied on at Permalink Reply
Hello and thank you for your reply.

You make an important point there: there could be updates between the "little backup pieces", and therefore we wouldn't have a perfect backup.

But what about doing the backup in one piece? The PHP script will still get all the table names and then fetch the data for each table, and time passes in the meantime anyway, doesn't it? So is there really a difference? Someone could also make updates to the database during a one-step backup, because it takes more than 30 seconds.


And I'd also like to thank you for that link; I'm going to look into it more closely later. I'm sure it will help me out :)
JohntheFish replied on at Permalink Reply
JohntheFish
If it all happened within one page, then locking and a single transaction is feasible. Spread in slices over many pages, locking and a single transaction is not feasible.

Even without locking/transaction integrity, the shorter a time a backup is made within, the less inconsistency can creep into it.

Having said all that, I don't know if the standard C5 built-in database backup runs within a single transaction, or whether a phpMyAdmin backup does.

Personally, I have found no reason to question the integrity of C5 database backups by either of these methods. As I noted earlier, what I would like as added functionality is an intelligent backup of the /files directory.
adajad replied on at Permalink Reply
adajad
I have two scripts running daily on my web server. One backs up the db and one takes care of my sites.

The db script gives me daily, weekly and monthly backups. The other script is a simple xcopy /E /H /D /C /Y /I with an exclude of the cache (basically a differential backup).

A _complete_ restore (of all my sites) takes about 7 minutes once Apache, MySQL and PHP are installed (assuming a total breakdown).

I'm on Windows server btw.

EDIT: No PHP scripts, just ordinary batch and VBS files scheduled to run every day.
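For anyone on Linux, a self-contained sketch of the same exclude-based copy; the tiny site tree here is invented for the demo, and rsync with --exclude would add the changed-files-only behaviour of xcopy /D:

```shell
#!/bin/sh
set -e
# Build a tiny fake site tree, then mirror it while skipping files/cache.
SRC=$(mktemp -d); DST=$(mktemp -d)
mkdir -p "$SRC/files/cache" "$SRC/files/uploads"
echo '<?php' > "$SRC/index.php"
echo img  > "$SRC/files/uploads/a.jpg"
echo junk > "$SRC/files/cache/tmp"

# tar in, tar out: a full copy with an exclude (cache never reaches $DST).
(cd "$SRC" && tar --exclude='files/cache' -cf - .) | (cd "$DST" && tar -xf -)
ls "$DST/files"
```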
Mainio replied on at Permalink Reply
Mainio
I also kind of think that the whole backup should live outside of the PHP logic. The problem with the set_time_limit function is (as said) that some shared hosting providers kill scripts that take too long to complete. The function might even be disabled on some shared hosts, in which case calling set_time_limit(600) would have no effect at all.

I have a simple (Linux) shell script that I use for account-based backups. Even on servers that host multiple sites, I run the cronjob separately for each site. It takes a database dump and a tarball of the whole public_html (**/cache excluded). You can configure the script to run as frequently as you like from cron, and within the script you can also control how many old backups are kept. This works with any site, including sites that don't use concrete5.
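Mainio's actual script isn't posted, but a script of that shape might look roughly like the sketch below. All names (SITE, DB_NAME, paths, the KEEP count) are placeholders, and RUN=echo makes it a dry run that only prints the commands; set RUN= (empty) to execute for real.

```shell
#!/bin/sh
# Dry-run sketch of a per-site backup: db dump + tarball of public_html
# (cache excluded) + simple rotation. Everything here is a placeholder.
RUN=echo                       # RUN=echo prints commands; RUN= runs them
SITE=example.com
DB_NAME=c5_site
BACKUP_DIR=/var/backups/$SITE
KEEP=7                         # how many old tarballs to keep
STAMP=$(date +%Y%m%d)

$RUN mkdir -p "$BACKUP_DIR"
# --single-transaction gives a consistent InnoDB snapshot of the dump
$RUN mysqldump --single-transaction "$DB_NAME" -r "$BACKUP_DIR/$SITE-$STAMP.sql"
$RUN tar --exclude='files/cache' -czf "$BACKUP_DIR/$SITE-$STAMP.tar.gz" "/home/$SITE/public_html"
# Rotation: remove everything but the newest $KEEP tarballs.
$RUN sh -c "ls -1t $BACKUP_DIR/*.tar.gz | tail -n +$((KEEP + 1)) | xargs -r rm -f"
```

Because it runs from cron rather than through Apache, none of the web-request timeouts discussed above apply to it.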

I think backing up just the files/ folder is not always sufficient, for example if the client has an account with access to install package updates. Package updates or concrete5 updates can sometimes break the whole site, so it's good to have a backup of the old packages and core folder as well.

IF you want to do this from PHP, the correct way (as JohntheFish said) is to split the job into smaller chunks so that the execution limit is never exceeded.


Br,
Antti / Mainio
mkly replied on at Permalink Reply
mkly
@Lintu

I think you are missing something here. There is absolutely nothing stopping you from writing this. Concrete5 is MIT licensed. And it would be awesome if you contributed it back to the community, but you can keep it all to yourself if you want. Or even sell it.

You certainly don't need to convince us of anything, as we do not hold any power over your ability to do this.

If you think it can be done and/or needs to be done, then by all means go for it. I would love to see a solution like this.
Mainio replied on at Permalink Reply
Mainio
Yeah, sure. I wasn't trying to shoot the idea down. Go for it if you want to!

Just wanted to point out these issues and the most obvious reason why no one has done this yet.

I also wanted to point out that even though the PHP code itself might be easy to write, it's good to consider the other issues related to this problem as well.
azvampyre replied on at Permalink Reply
azvampyre
http://www.concrete5.org/community/forums/installation/godaddy-backup-blues/

I tried to do it through a PHP script, but as has been pointed out, it times out after 10-15 seconds.
That's why I used shell and cron. It has never failed on any of my sites (some over 4 GB).
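The shell-plus-cron setup can be as small as a single crontab line; the script path, log path, and schedule below are examples only, not azvampyre's actual configuration:

```shell
# Run the backup script nightly at 02:30 and append its output to a log.
30 2 * * * /home/site/bin/c5-backup.sh >> /home/site/logs/backup.log 2>&1
```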

Here's the application I wrote to address the problem with GoDaddy. I can change it for non-GoDaddy hosts and make it a web-based script as well. I was going to make it into an addon, but since it rewrites the 'login/logoff' page (just to add a link to the backup page), I wasn't sure if that was allowed.