Glass_Knife

Backup Software for Small Teams


I work with a few people, and we need to back up our stuff. The problem is that the boss is paranoid, so no cloud backup service will work. The only things I can seem to find are simple Linux shell scripts that don't really work, or grand enterprise solutions that want thousands of dollars per terabyte.

 

I pinged everyone in my contacts, but they all use some kind of cloud storage, and when I mentioned that we need to run without an internet connection, I just got blank stares and silence.

 

Is anyone using anything cheap/free but reliable?

 

http://www.gamasutra.com/view/news/219462/Cloud_source_host_Code_Spaces_hacked_developers_lose_code.php


The cheapest non-internet backup system is to just manually copy stuff to an external drive and then shelve the drive somewhere safe.

 

I was hoping for something a little more automated and a little more redundant.


tl;dr: Get Synology.

 

It depends on what you want to back up, how much money you're willing to spend, and how much inconvenience you're willing to put up with.

 

Everything that isn't source is pretty much scratch data anyway; the complete OS gets restored from a DVD image when necessary.

 

Sources (i.e. your source repository) are best kept in a revision control system anyway, which is a "kind of" backup in itself, though obviously you should have a reliable backup of the repo, too. I've done that in the past with a Subversion server on a Linux box, and later on a cheapish NAS, all with self-made bash scripts for daily backups. Total shit.

In particular the latter (a cheapish WD NAS) turned out to be a nightmare in retrospect. Nightmare as in total data loss (good thing the backup script worked, heh).
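For reference, a daily dump script along those lines can be as small as the sketch below. This is a minimal example, not literally what I ran; the repository path and backup mount are placeholders for your own:

    #!/bin/bash
    # Nightly Subversion repo backup: full dump into a dated, compressed
    # file, keeping the last 30 days. Paths below are placeholders.
    set -euo pipefail

    REPO=/srv/svn/repo        # the Subversion repository
    DEST=/mnt/backup/svn      # a mounted backup share

    mkdir -p "$DEST"
    svnadmin dump --quiet "$REPO" | gzip > "$DEST/repo-$(date +%F).svndump.gz"

    # prune dumps older than 30 days
    find "$DEST" -name 'repo-*.svndump.gz' -mtime +30 -delete

Run it from cron and it takes care of itself. Which is exactly the trap, by the way: test the restore now and then, not just the dump.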

 

So this week I got my Synology 4-bay DiskStation delivered, and it's just awesome. Plug in some disks and it sets them up as a redundant RAID, no questions asked and no trouble. RAID is not a backup and no replacement for a backup, but it is already one step in that direction.

 

Besides the obvious (working as a network share), the DiskStation is one click away from running a Subversion (or Git) server. One more click and it backs up automatically to a second DiskStation or another network share (or the cloud), if you want that done. Oh, and it does 100+ MiB/s read and write on average, and it comes with software that manages our security cameras as well, no extra computer necessary...

 

The price tag is steep, but it's worth every cent in my opinion. Just buy two of these, one for hosting the repo, and another for backup.

 

For backing up "normal stuff" on a (non-programmer) desktop computer, I can recommend Seagate Replica drives. They're totally unsuitable for a programmer's machine since the software is too intrusive, but for the average user (such as your wife) they're just great. Plug one in and don't think about it any more: it backs up everything silently without you doing anything, and even lets you browse file revisions if need be.

Edited by samoth

At my last position we used Cobian Backup (version 9, I believe). We just made nightly archives onto a server across the Windows network, and less-regular archives onto removable drives. It was (and is) free, and was more or less adequate for a small operation (< 100 GB of development files). It was not always easy to find and fix things when they went wrong (disk space and name changes being the most common culprits), but in general it worked, at least better than no backups at all.

You could try our software - SyncBackPro. We've been in the business for 10 years. Looks like it would do what you're looking for. www.2BrightSparks.com.


You could try our software - SyncBackPro. We've been in the business for 10 years. Looks like it would do what you're looking for. www.2BrightSparks.com.

 

This one looks like it may work.


If you already have a server that you'd like to restore to, then CrashPlan's app can back up reliably to just that machine. Of course they'd prefer you subscribe to their cloud, but that's absolutely not necessary: it will happily do LAN backups or removable-media backups. If you want an all-in-one solution with hardware, Synology is where it's at.


 

You could try our software - SyncBackPro. We've been in the business for 10 years. Looks like it would do what you're looking for. www.2BrightSparks.com.

 

This one looks like it may work.

 

 

Nope. Windows-only solution. And yes, tape does not seem sexy at all...


I'm inexperienced in this area, so take this with a grain of salt: I use a NAS for my family's data and my own code, with the free desktop application Cobian Backup doing remote backups on an automated schedule.

 

For redundancy, I intend to back up every computer to every other device and then also to the NAS, but I've been too busy to sit down and set that up. I'm still running into a problem getting one of the laptops to back up to the NAS, though; probably a Windows permissions issue or something.

Edited by Servant of the Lord


Restoring data is easy; restoring applications is hard. Data can be just a matter of copying a file back, while restoring an application may involve totally rebuilding a machine. And if you have any moderately complex, non-trivial customizations, you may as well accept it: you'll never get it back the way it was. So the solution is to back up everything. The OS partition, application partitions, data, configurations, everything.
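On Linux, the bluntest way to get "everything" is a raw partition image; a minimal sketch (the device name and paths are placeholders, and dd will cheerfully overwrite the wrong disk if you mistype the device):

    # Image the OS disk to a compressed file on the backup mount.
    dd if=/dev/sda bs=4M | gzip > /mnt/backup/os-$(date +%F).img.gz

    # Restore: boot from live media first, then write the image back.
    # (The date in the file name is just an example.)
    gunzip -c /mnt/backup/os-2014-07-01.img.gz | dd of=/dev/sda bs=4M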

 

A recent and interesting development in this space, in a *nix environment, is Docker. It's a system that uses Linux kernel features to create portable, versioned application packages, complete with their own view of the filesystem and configuration. The idea is that you isolate the applications you rely on in a Docker container, so the entire configuration is self-contained and can easily be transported between systems. You can put something as simple as a DNS server inside, something like a LAMP or development stack, or even an entire Linux desktop that you access remotely. Configure and build it locally, then push it up to your cloud server if you want, transfer it to another machine, or just back up your containers and restore from a catastrophic loss.

When you need to reconfigure, you just boot into the container's shell, make your changes through a typical command-line interface, then commit those changes and restart your container if necessary. Since the containers are versioned, you can also just back out to a working version while you repair any mistakes or run tests on your latest changes. You can also mount directories from the host, so you can isolate your app or stack from its storage if you want to back the storage up in more usual ways (say, a database or Git repository that you back up some other way).
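In concrete terms, the round trip looks something like this (the image and container names are made up for illustration; the commands are standard Docker CLI):

    # Start a container from a base image, mounting host storage into it.
    docker run -it --name mystack -v /srv/data:/data ubuntu /bin/bash
    # ...install and configure your stack in the shell, then exit...

    # Commit the configured container as a new, versioned image.
    docker commit mystack mystack:v1

    # Export the image to a tarball you can back up or move elsewhere.
    docker save mystack:v1 > mystack-v1.tar

    # After a loss (or on another machine), load it and carry on.
    docker load < mystack-v1.tar
    docker run -it -v /srv/data:/data mystack:v1 /bin/bash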

Edited by Ravyne


In my opinion, one of the most important properties of a good backup system is that it's invisible: unless you need it, you never notice it's there at all.
 
Restoring data from a NAS (or restoring the NAS from another NAS or from the cloud) works just like that: you never notice anything happening, but the data is there when you need it.[1] Synology does this with a few mouse clicks, no obscure manual configuration, and no need to pray that it will hopefully work.
 
Restoring applications works fine from a complete system-partition image, which conveniently fits on a Blu-ray (or on a DVD for a Linux system, if you're lucky). The only obnoxious thing is that operating systems are somewhat dumb about isolating their crap from the user's stuff, though some are not as abysmal as others.

For Windows, being able to image the system means installing Windows and all programs on C: and putting everything else, including personal folders, desktop, etc., somewhere else (like D:, duh), changing the relevant environment variables, and then hoping no program has C:\data hardcoded... or relying on NTFS to mount another partition in an empty folder (like C:\Users), for which you first have to boot from BartPE or similar to copy over the contents and clear the directory.

For Linux it means, in the easy case, having / with everything-but-/usr on one partition and mounting /usr from another partition, which is not too much hassle and is invisible once set up. Well, in theory, that is (practice may look slightly different).
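On the Linux side, the split comes down to two lines in /etc/fstab; a sketch, with placeholder device names:

    # Everything except /usr lives on the root partition (imaged and
    # restored as a unit); /usr is mounted from its own partition at boot.
    /dev/sda1  /     ext4  defaults  0  1
    /dev/sda2  /usr  ext4  defaults  0  2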

 

 
[1] Unless you have a cheap WD NAS, as I can testify. Mine stopped showing the switchboard at some point without me doing anything (yes, yes... I know, but I really didn't do anything; presumably something broke during one of those secret, obscure 4-in-the-morning automatic updates?). Luckily it still "worked as normal" otherwise, and I still had SSH access, so no problem, right? Then one day I needed FTP, so I tried to start the FTP server, but /etc/init.d/ftpd only returned "nah, sorry... 'tis disabled". There was no obvious way to enable it without clicking the little box on the switchboard, short of digging through a few thousand undocumented WD files. Great. So I tried to get a switchboard running again.

The updatefirmwaretolatest.sh script failed with an error, and apt-get wouldn't do the job either. Trying to download the firmware and install it manually didn't work, either: WD shows a download link, and when you click it, they tell you "yeah, sorry, updating the firmware manually isn't supported, but you can talk to a CS guy if you like".

So eventually I stumbled across restoredefaults.sh. Sounds good, doesn't it? It probably wouldn't fix the issue, but it couldn't hurt and was worth a shot, since by default the switchboard is shown and the FTP server is running. Yeah.

Good job that, by default, SSH is disabled too... so that left me with a brick that had neither SSH nor a switchboard, and that only showed the default "public" share over SMB.

 

My work PC has a removable HDD mounting frame, so no problem recovering the data... opened the case, took out the hard disk, put it in, and booted up Knoppix, only to find that you cannot mount the disk, because some fuckhead at WD thought that using ext4 with a 64 kB block size was an awesome idea (seriously, why would anyone do that, except for the explicit purpose of annoying people?). So I installed Ubuntu on a USB stick so I could mount the disk with fuseext2, and guess what: for WD, "restore defaults" means, among other things, "format the fucking data partition".

Edited by samoth


and it's 100+ MiB/s read and write on the average
Woah, I just realized what a dumb fuck I am. You can configure the Synology box to work as an iSCSI target with 3-4 clicks, and it integrates into Windows as if it were a built-in drive (well, of course; that's kind of the point of iSCSI).

 

As a side effect, those 100+ MB/s read and write magically turn into 250 MB/s on average (and stay at 50+ MB/s even in nightmare scenarios like copying 40,000 header files, something that makes SMB degrade to 2-4 MB/s). Of course, the filesystem write cache might be playing into this too, since a single 4 GB file copy runs at 400+ MB/s, which is 150 MB/s more than MPIO can (in theory) deliver on my network. But it's still awesome.
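For the record, the Linux equivalent with open-iscsi is about as short (the IP and IQN below are placeholders; use whatever the discovery step reports):

    # Ask the NAS which iSCSI targets it exposes.
    iscsiadm -m discovery -t sendtargets -p 192.168.1.10

    # Log in; the LUN then appears as a local block device (/dev/sdX)
    # and can be partitioned and formatted like a built-in disk.
    iscsiadm -m node -T iqn.2000-01.com.synology:nas.target0 -p 192.168.1.10 --login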


The simplest, most effective free solution is rdiff-backup (a Linux tool based on rsync, but over the network it can back up your whole Windows office). You simply tell it which files/folders to back up; on the first run it creates a full backup, and afterwards only patches/diffs for each file (with a command you can also make full backups later, for example every week). You can easily recreate any version of any file/folder at any time (simply by providing a date and time on the command line), and it's also pretty fast. Just get a cheap Linux box (if you don't have one already) with big HDDs, hook it up to the network, put the command with the folders and files in a shell script, add an entry to crontab for automated backups, and you're set. All diffs are stored in a folder inside the backed-up folder in a readable, timestamped format, so it's also easy to find the right version.
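A minimal sketch of that setup (the host name and paths are placeholders):

    # Nightly incremental backup to the Linux box, over SSH.
    rdiff-backup /home/team/projects backupbox::/srv/backups/projects

    # Restore a folder exactly as it was three days ago.
    rdiff-backup -r 3D backupbox::/srv/backups/projects /tmp/projects-restored

    # Prune increments older than eight weeks.
    rdiff-backup --remove-older-than 8W backupbox::/srv/backups/projects

    # crontab entry: run the backup every night at 02:30.
    30 2 * * * rdiff-backup /home/team/projects backupbox::/srv/backups/projects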

Edited by ilreh


What if your office burns down?

Consider backing up all your stuff to an external HDD at least once a week and storing it outside your office building. You could use a self-storage business for this, or alternatively keep the drive in a safe-deposit box at a bank (be aware that the bank's opening hours may keep you from getting at your backup in the evening or on weekends).

It should include not only your current files but also your whole Git repository and so on.

 

This "external" backup should not replace your in-house backup at all, it complements it and is needed for a worst-case scenario.

