Game content source repository?

Started by
15 comments, last by snacktime 9 years ago

I do use Dropbox, but not for my assets. I don't need diffs of my assets, but prior versions might be a good idea - though I've never actually had to go back and retrieve an earlier version.

If you pay, Dropbox does give you the ability to retrieve old versions. We use it for our source assets at the moment because it's so easy to use, and easy to collaborate with outsourced contractors - no servers/passwords/VPNs for us to manage ourselves.
It does make me anxious, though, having another company promising to keep all our core data safe... I'd recommend local backups/archives in case anything goes wrong!

One downside to Dropbox's ease of use is that it's ridiculously easy for someone to delete everything. If they just try to move the "Assets" dir out of "D:/Dropbox" (etc.), it's the same as deleting each asset one by one. Everyone else on the team will start seeing their files disappear as Dropbox dutifully replicates those changes!
Correct me if I'm wrong, but doesn't Dropbox move all remotely deleted files into a local cache directory? At least at some point they did it like that. I noticed that because I used Dropbox to send some largish files to someone else with the recommendation to just delete them when they were done. After a while I noticed the partition in question getting awfully full until I found that cache directory...
We use perforce here, on a large project with multiple teams around the world, to store both the code and the data.

Some things that make perforce good for this are:

- the ability to set different locking policies per file type (you want to allow concurrent checkouts for source code, but exclusive checkout for binary data such as art assets)

- the ability (again, per file type) to limit the number of revisions for which the actual data is kept. For instance, you can have it store only the last 5 revisions of PNG files and discard earlier ones. This is vital for very large projects that deal with a lot of data, to keep the size of the repository under control.

- Perforce lets you set up proxy servers, and it works really well: the dozens or hundreds of people working at each studio talk only to their local proxy, which in turn incrementally synchronizes itself with the central repository. This way, the (large) data being committed elsewhere in the world is downloaded only once by your local proxy, and then everyone gets it on their PC over the LAN. Imagine if a team of 100 people had to download the same latest art assets directly over the internet...
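The per-file-type policies above live in Perforce's typemap. A minimal sketch of what such entries might look like (the extensions, depot paths, and revision limit here are illustrative, not from the poster's setup - see `p4 help typemap` for the real syntax on your server):

```
# Illustrative 'p4 typemap' entries
TypeMap:
    text        //depot/....c    # source code: concurrent checkouts, full history
    text        //depot/....h
    binary+l    //depot/....fbx  # art assets: exclusive checkout (+l)
    binary+lS5  //depot/....png  # exclusive checkout, keep only the last 5 revisions (+S5)
```

The `+l` modifier enforces exclusive open, and `+S5` tells the server to store file data for only the five most recent revisions, matching the two bullets above.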

Despite all this, it is very responsive in practice: someone on the other side of the world pushes a commit through their local proxy and you see it almost immediately. Of course, when large operations such as branching or big commits are underway it tends to slow down somewhat, but nothing really crippling.
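For reference, a proxy like the one described is Perforce's `p4p` tool; a hedged sketch of launching one (the host name, port, and cache path are made up):

```
p4p -p 1666 -t central.example.com:1666 -r /var/p4proxy/cache -d
```

Here `-p` is the port the proxy listens on, `-t` is the central server it forwards to, `-r` is the local cache directory that holds the once-downloaded file data, and `-d` runs it as a daemon.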

I keep all my source code on a server machine hosted in my utility room.

It runs 24/7 and is a debian linux based machine running Git and Atlassian Stash. I also have some other bits and bobs on there like a mail server, apache2, etc.

I used CVS originally (absolutely atrocious), then Subversion for many years, and I'm only just making the move to Git.

I tend to keep all my assets and the game code together in the same repository and just split it by directories; at the current size of the project there isn't much need for separate repositories. I can see how you might want to split them on larger projects, though - to avoid, for example, cluttering up an artist's hard disk with source code they won't ever use...
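If you do keep code and assets in one repository, Git's sparse-checkout can keep the unwanted directories off a teammate's disk without splitting repos. A minimal sketch with a made-up layout (requires Git 2.25 or later; directory names are hypothetical):

```shell
# Create a demo repo with separate Code/ and Assets/ directories (hypothetical layout)
git init demo && cd demo
git config user.email "dev@example.com" && git config user.name "Dev"
mkdir -p Code Assets
echo 'int main(void) { return 0; }' > Code/main.c
echo 'placeholder texture'          > Assets/hero.png
git add . && git commit -m "initial layout"

# On the artist's machine: restrict the working tree to Assets/ only
git sparse-checkout init --cone
git sparse-checkout set Assets
ls   # Code/ disappears from the working tree; Assets/ remains
```

History is still shared; only the checked-out working tree shrinks, so the artist never sees the source directories.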

I use Dropbox and SVN. My project dir is in my Dropbox folder, so I get all the benefits of cloud storage. We also use SVN with TortoiseSVN, because I like to know who did what, and for diff management. I have my own Redmine server with SVN installed on it - I use Turnkey Linux - Redmine. Once it's set up, it's awesome.

GitHub is coming out with large-file support, and there is a commercial Git-based platform that supports large files - I can't remember the name, but it's based on git-annex. So the future is looking brighter.
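(GitHub's large-file support shipped as Git LFS, which is driven by a `.gitattributes` file in the repository. A sketch of what tracking art assets might look like - the extensions here are just examples:)

```
# .gitattributes - route large binary assets through Git LFS
*.png  filter=lfs diff=lfs merge=lfs -text
*.psd  filter=lfs diff=lfs merge=lfs -text
*.fbx  filter=lfs diff=lfs merge=lfs -text
```

With this in place, Git stores only small pointer files in history and keeps the actual binaries on the LFS server.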

Even though Perforce handles binary/large files really well, I just can't give up the GitHub workflow for code. So I've been using Amazon S3 for large binary files. Managing versions gets tricky when working in teams - let me rephrase, it outright sucks - but it's been manageable.
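One way to tame version sprawl on S3 is to enable bucket versioning and add a lifecycle rule that expires old noncurrent versions, roughly mirroring Perforce's keep-only-the-last-N-revisions trick above. A sketch of such a lifecycle configuration (the rule ID, prefix, and counts are made up, and it would be applied with `aws s3api put-bucket-lifecycle-configuration`):

```json
{
  "Rules": [
    {
      "ID": "trim-old-asset-versions",
      "Filter": { "Prefix": "assets/" },
      "Status": "Enabled",
      "NoncurrentVersionExpiration": {
        "NewerNoncurrentVersions": 5,
        "NoncurrentDays": 30
      }
    }
  ]
}
```

This keeps the current version plus the five most recent superseded versions of each asset, deleting older ones 30 days after they stop being current.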

This topic is closed to new replies.
