
Version Control and programming with a team



#1 deavisdude   Members   -  Reputation: 107


Posted 12 November 2013 - 05:02 PM

Hi, I'm quite obviously new to software development in general, but a group of friends and I would like to develop a video game. However, we have no idea how to work on the same project at the same time. I have looked at things like GitHub and SourceForge, but we don't quite understand the whole thing, and we'd prefer not to make our project open source. Currently I just keep the workspace in a shared Dropbox folder, but I don't think that is going to work once we really start working on this all of the time. If there is somewhere I should look for more information, I'd really appreciate a link.


Edited by deavisdude, 12 November 2013 - 05:02 PM.



#2 wintertime   Members   -  Reputation: 1640


Posted 12 November 2013 - 05:11 PM

There are other websites that give you private repositories without paying; one example is Bitbucket. If you have never used version control, you will find Mercurial (hg) easier to use than Git. Git does basically the same thing with much more complicated commands; they allow a bit more freedom through advanced features, which hg would require you to configure first before allowing.
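To give a feel for the difference, here is a minimal first session in each; the repository URL and commit message are placeholders:

    # Mercurial: clone, edit, commit, push
    hg clone https://bitbucket.org/you/ourgame
    cd ourgame
    # ...edit files...
    hg commit -m "tweak jump height"   # commits every modified tracked file
    hg push

    # Git: the same work, but changes must be staged before committing
    git clone https://bitbucket.org/you/ourgame.git
    cd ourgame
    # ...edit files...
    git add -A                         # stage the changes first
    git commit -m "tweak jump height"
    git push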


Edited by wintertime, 12 November 2013 - 05:13 PM.


#3 frob   Moderators   -  Reputation: 20085


Posted 12 November 2013 - 05:22 PM

You are right to look into a proper version control system. Working without one is like jumping from a plane without a parachute. The ability to roll back in time and look at the full history of the project is very important.

 

If you are just looking for the academic side, wanting to know what it is, what it does, where it has been, and where it is going, there is a pretty good writeup from a few years back by Eric Raymond (a superstar in the open source world).

 

You first need to decide what kind of version control system is best for your group.  There are many variants and they are all described in that article above.

 

Once you have decided, you need a machine that can host it. You can buy or rent time on a dedicated host, or, if you configure things correctly, you can host it yourself on someone's home machine.

 

Then you just do it: Get the software, install it, configure it, and run it. 
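For example, if you settled on Git, a minimal self-hosted setup over SSH might look like this (host name and paths are made up):

    # On the host machine: create a bare repository to push to and pull from
    mkdir -p /srv/repos
    git init --bare /srv/repos/ourgame.git

    # On each developer's machine: clone it over SSH
    git clone user@home-server:/srv/repos/ourgame.git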


Check out my personal indie blog at bryanwagstaff.com.

#4 snacktime   Members   -  Reputation: 291


Posted 12 November 2013 - 11:34 PM

GitHub has private accounts; it's used by thousands of companies developing commercial software. I think it's under $10 a month, which gives you several private repos.

 

One thing to remember is that GitHub != Git. GitHub has a workflow and a lot of tools to make your life easier, and there are a ton of third-party services that hook into GitHub. As far as platform-as-a-service source control goes, no one really touches GitHub, and you have to factor that into your larger decision about which source control software to use.

 

Beyond that I would just say stick with a distributed source control system like Mercurial or Git.  If you want to be able to manage different branches and merge code without pulling your hair out, don't touch Subversion.
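As a sketch of what that painless branching looks like in Git (the branch name is arbitrary):

    git checkout -b inventory-ui   # create and switch to a feature branch
    # ...edit, git add, git commit as usual...
    git checkout master            # switch back to the main line
    git merge inventory-ui         # three-way merge against the common ancestor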



#5 frob   Moderators   -  Reputation: 20085


Posted 13 November 2013 - 12:05 AM


snacktime said:
I would just say stick with a distributed source control system like Mercurial or Git. If you want to be able to manage different branches and merge code without pulling your hair out, don't touch Subversion.

There are benefits and drawbacks to each solution. 

 

While the ability to branch easily is often cited as a benefit, discovering that you suddenly have 1.2GB to sync and store because of a few binary-file branches is certainly a drawback. It isn't just the download time; it is also the fact that even 2TB and 3TB drives run out of space far too quickly these days. If you work on a very large game containing a few hundred gigabytes of raw assets, the storage requirements of distributed systems can quickly grow into multiple terabytes. Once you enter that kind of model, the relatively high cost of Perforce becomes cheap compared to the indirect costs you encounter with a DVCS.

 

Every feature is a tradeoff, and every system is different. For example, distributed version control has the benefits of data locality and easier branching, but comes with the potentially serious costs of storage and sync, the added complexity of keeping everyone together, and the mental cost of keeping track of which branch does what. Your project might be better with one option or the other.

 

Choosing the best version control solution is an area with many options tailored to different development styles. Learn about them all and select carefully.


Check out my personal indie blog at bryanwagstaff.com.

#6 wintertime   Members   -  Reputation: 1640


Posted 13 November 2013 - 04:42 AM

Wouldn't the size problem be reduced if you avoided putting continuously updated precompiled binary files into Git/Mercurial, and didn't add all the external stuff you depend on into a single repository, but instead used sub-repositories?

You could also split things up, putting the source art files, the compiled art files, and the compiled code into three SVN repositories, further reducing the load on the DVCS by keeping only source code in it. Or add a custom merge tool for some of the binary files, or convert them when storing them into the repository to make them mergeable.

Maybe on bigger projects people already have a build server that compiles everything to test commits; it could also update the separate SVN repositories for compiled things, so that artists avoid having to compile code and programmers avoid converting art files themselves.
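In Git that splitting can be done with submodules (Mercurial has the equivalent subrepos, via an .hgsub file); a rough sketch with made-up URLs:

    # Reference big or external pieces from the main repo at pinned revisions
    git submodule add https://example.com/ourgame-assets.git assets
    git submodule add https://example.com/thirdparty-libs.git extern
    git commit -m "track assets and third-party libs as submodules"

    # A fresh clone fetches them explicitly
    git clone https://example.com/ourgame.git
    cd ourgame
    git submodule update --init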



#7 LorenzoGatti   Crossbones+   -  Reputation: 2659


Posted 13 November 2013 - 04:45 AM

You can run Git on your own web server, with authentication to restrict repository access to your colleagues. It isn't difficult: I set up Git as a CGI (using Apache on Windows) with SSPI authentication (against a Windows network domain) and a whitelist of allowed Windows users, without any trouble.
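The Apache side of such a setup looks roughly like this sketch of Git's smart-HTTP CGI (git-http-backend); the paths and user names are illustrative, and Basic auth stands in here for the Windows-specific SSPI module:

    # httpd.conf sketch: serve repositories under /srv/git via Git's CGI
    SetEnv GIT_PROJECT_ROOT /srv/git
    SetEnv GIT_HTTP_EXPORT_ALL
    ScriptAlias /git/ /usr/libexec/git-core/git-http-backend/

    <Location /git/>
        AuthType Basic                  # SSPI was used here in my setup instead
        AuthName "Git repositories"
        AuthUserFile /srv/git/.htpasswd
        Require user alice bob          # the whitelist of allowed users
    </Location>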
Produci, consuma, crepa

#8 LorenzoGatti   Crossbones+   -  Reputation: 2659


Posted 13 November 2013 - 05:28 AM

wintertime said:
Wouldn't the size problem be reduced if you avoided putting continuously updated precompiled binary files into Git/Mercurial, and didn't add all the external stuff you depend on into a single repository, but instead used sub-repositories? You could also split things up, putting the source art files, the compiled art files, and the compiled code into three SVN repositories, further reducing the load on the DVCS by keeping only source code in it. Or add a custom merge tool for some of the binary files, or convert them when storing them into the repository to make them mergeable.
Maybe on bigger projects people already have a build server that compiles everything to test commits; it could also update the separate SVN repositories for compiled things, so that artists avoid having to compile code and programmers avoid converting art files themselves.


Compiled binary files have no reason to be in SOURCE control. Important ones, such as public packages of release or beta-test versions, should be archived by build managers to Dropbox, network shares, or other kinds of "dumb" storage, so that others can download them as needed.
Old compiled binaries, as opposed to builds developers compile themselves, are needed only for unusual reference purposes (e.g. verifying that building the appropriate revision of the sources reproduces what was released); unlike sources, they are not routinely updated, compared, or copied to and from personal workspaces.

Correctly managed binary source assets are unlikely to be troublesome. They should be available to everybody (enabling all developers to build the game), which is a good reason to make all revisions easily available, and they should change only rarely and in meaningful, fine-grained increments. For example, repainting a 3D model should change that model's texture maps, not a big texture atlas containing lots of unrelated images; the big atlas can be kept outside source control and rebuilt automatically.
A sane organization of assets and build tools is an opportunity, not a cost incurred because of source control. The initial effort of setting up an easy, automated, effective workflow around source control should be weighed against skipping it: sinking into progressive complication and confusion, and eventually throwing in the towel (or wasting a lot of time) because of errors in manual builds.
Produci, consuma, crepa

#9 Trienco   Crossbones+   -  Reputation: 2110


Posted 14 November 2013 - 10:52 PM

I'll go with wintertime when it comes to Git: use it for source code (and other plain-text formats) only. Large repositories can make everything annoyingly slow, and Git can't really shine with un-diff-able binary files anyway. Place binary assets somewhere else, even if it can be a pain to keep source and assets in sync this way (make sure whatever you pick makes it easy to get a version/snapshot based on date and time, just in case).
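For example, if the assets were kept in Subversion, date-based snapshots are built in, which makes matching an asset state to a given source revision reasonably painless (the URL is hypothetical):

    # Check out the asset tree exactly as it was at a given date and time
    svn checkout -r '{2013-11-14 18:00}' https://example.com/svn/assets/trunk assets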

 

Or (because one might argue that huge amounts of binary assets are a major part of games), don't use git for game development.


f@dz - http://festini.device-zero.de

#10 Yrjö P.   Crossbones+   -  Reputation: 1412


Posted 15 November 2013 - 07:40 AM

wintertime said:
There are other websites that give you private repositories without paying; one example is Bitbucket. If you have never used version control, you will find Mercurial (hg) easier to use than Git. Git does basically the same thing with much more complicated commands; they allow a bit more freedom through advanced features, which hg would require you to configure first before allowing.

Also, GitHub has student accounts that allow you to make up to 5 private repos. You just need to mail GitHub to apply.

I also currently face the dilemma of what form of version control to use. Complicating the matter:
- The project only runs a couple more weeks. There's no time to spend on infrastructure.
- Most of the team are not programmers. I'm the only one with some experience of using git/hg/svn etc., and even then only with text data, not binary blobs.
- Because of the skillset and time factors, I can't ask the others to learn much VCS.
- We use Unity3D, and its projects do not play well with version control.
- At the end we'll have quite a lot of assets.

We went ahead and used git + GitHub for now, but it's so awful in these circumstances that I'm still wondering if we should just forget the repo and dump the whole thing in Dropbox. Or what other VCS we should use instead. I'll have another short project after this one, so I really want to figure out a solution even if I don't have time to apply it to the current project.

When it is easy to separate the assets from the code, and only programmers need to touch the VCS, Git and hg are great. I'd recommend hg for newbies; Git has a very long learning curve.
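For what it's worth with Unity specifically (based on its editor settings of that era; details may vary by version): switching to visible .meta files and text serialization makes projects much friendlier to a VCS, and ignoring the generated folders keeps the repository small. A .gitignore sketch:

    # .gitignore sketch for a Unity project: the editor regenerates these
    Library/
    Temp/
    obj/
    *.csproj
    *.sln
    *.userprefs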

#11 wintertime   Members   -  Reputation: 1640


Posted 15 November 2013 - 09:59 AM

Yeah, those artists would have had a much easier life with hg. Start working: "hg pull". Then change something: "hg commit". Done. You only need "hg add" if you create new files; there's no forgetting to stage something like with Git, no weird "." in the add command, no quirky extras in commands, and so on. "hg push" at the end of work, and you're done. ;)



#12 frob   Moderators   -  Reputation: 20085


Posted 15 November 2013 - 12:56 PM

LorenzoGatti said:
Compiled binary files have no reason to be in SOURCE control. Important ones, such as public packages of release or beta-test versions, should be archived by build managers to Dropbox, network shares, or other kinds of "dumb" storage, so that others can download them as needed.

 

This is a common debate. I'm on the opposite side.

 

I'm a very strong believer of putting basically everything in version control.

 

My first test is the 'new machine install' test. I believe we should be able to take a freshly formatted machine and run a single bulk install script that puts on Visual Studio or Maya, Python, Perl, and all the other software needed for a programmer/modeler/animator/tester configuration. Then I should be able to sync to version control and be done. This means all the third-party libraries and packages should be available through version control. If that doesn't work, I consider it a defect to be fixed.
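As a sketch of what passing that test might look like; the installer layout, flags, and depot path here are all invented for illustration:

    #!/bin/sh
    # bootstrap.sh (schematic only)
    # 1. Bulk-install the tools for this machine's role:
    for installer in installers/*; do
        "$installer" --silent
    done
    # 2. One sync brings down everything else: source, assets,
    #    third-party libraries, and the build pipeline.
    p4 sync //depot/ourgame/...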

 

My second test is the 'instant demo' test. I believe we should be able to take a freshly formatted machine, run a single bulk install script, and pull down any one of a large number of builds. These should include all the nightly builds (including all debug information) for at least one month, and every build ever released outside the building (including all debug information). The list of available builds can easily go back for many years on a long-term project.

 

In short, we should be able to turn any machine into a demo machine, a test machine, or a development machine in just the time it takes to transfer the files and run the minimal number of installers. We should be able to re-create the exact build environment for any build over the history of the project using only an OS install disk and a version control installer and nothing more.  

 

 

Over the years I have learned it is best if all of this is kept in a single location inside version control, rather than spread out across many network locations such as the Dropbox and network shares you described. That goes doubly for data kept in an external database: don't do that; instead, extract the data you need and store that copy in version control. Since you can easily purge files from version control, there is almost no space difference. It doesn't happen often, but it happens often enough: you need to be able to re-create everything as it existed years ago. Having EVERYTHING in version control, including binary files and installation files, is perhaps the most reliable way to meet that need.

 

You do not know when you will need to pull a very old build, including all of its debug information. On one long-running game we needed to re-create some of the libraries as they had existed about eight years earlier. Our repository had well over a million submissions, but we were able to go back to that very old build, completely re-create the build environment of VS2002 and Windows 2000, install all the build tools and the pipeline, and regenerate everything as needed. I'm confident we could not have done that if we hadn't kept EVERYTHING in version control. Having that ability enabled the studio to take advantage of a multi-million dollar deal.

 

There are of course people who disagree with the policy of putting everything in version control. Their logic escapes me, and I usually attribute it to a lack of experience in actually NEEDING historical builds. For now it is enough to say that it is a common point of disagreement around the Internet, and something people feel strongly about.


Check out my personal indie blog at bryanwagstaff.com.

#13 3Ddreamer   Crossbones+   -  Reputation: 3129


Posted 15 November 2013 - 01:01 PM

I have not found VC software that has all the characteristics I want: full functionality, agile navigation, and customization, without being a jungle to learn.

 

 

 

Clinton


Personal life and your private thoughts always affect your career. Research is the intellectual backbone of game development and the first order. Version Control is crucial for full management of applications and software. The better the workflow pipeline, the greater the potential output for a quality game. Completing projects is the last but finest order.

 

by Clinton, 3Ddreamer


#14 LorenzoGatti   Crossbones+   -  Reputation: 2659


Posted 16 November 2013 - 06:20 PM

 

frob said:
[frob's post #12, quoted in full above]

 

 

You describe good practices, but for the purposes of this discussion the tools, libraries, and "everything" you rightly keep under version control are source files, i.e. files needed to build the software. Replacing a compiler with an updated one has the same effect as editing your own files: it changes a configuration that should be recorded in source control because it affects the build results.

 

What has no need to be in source control are the build products, because they are never edited, replaced, etc. like source code or tools. You describe situations, like setting up a test machine with a specific build, in which pulling builds from a revision control system is convenient, but that remains essentially different from how actual source code is used and managed. The same level of automation and reliability can be obtained in other ways; shelving build products and forgetting about them until they are needed for auditing purposes, as I suggested, is usually more than enough.

 

Even tools can be managed differently. At my workplace there are a number of read-only virtual machine images on shared drives, each configured with the tools for a different ongoing project and cloned from the result of a careful manual setup. They are copied to, and run on, each developer's computer, letting people switch easily between projects without conflicts, start work on a new project easily, and reset corrupted environments to a known good state. Actual sources stay in a normal VCS. If you want to reproduce a system configuration, virtual machine images are far more reliable than rerunning install scripts and hoping the result is the same.

 

Keep in mind that this discussion includes people who are scared of proper version control for binary assets like images, or of proper version control in general; don't ask too much of them.


Produci, consuma, crepa

#15 3Ddreamer   Crossbones+   -  Reputation: 3129


Posted 16 November 2013 - 10:22 PM

We know that there are two approaches to management here: version control and source control. In my opinion, both combined should be considered part of software management, as the total strategy for a complete grip on things. In some cases developers integrate both version control and source control by preference, especially in large software development firms. Is this not true?

At this point, I have not found software that satisfies me by combining both version control and source control under one management structure. I will probably be forced to do what I do in 3D art creation and put two different programs (one for source control and one for version control) into the workflow pipeline for coding.

Who here has found one program for version control and another for source control (working well together) that is more doing than learning? Please let me know. :)


Edited by 3Ddreamer, 16 November 2013 - 10:24 PM.

Personal life and your private thoughts always affect your career. Research is the intellectual backbone of game development and the first order. Version Control is crucial for full management of applications and software. The better the workflow pipeline, the greater the potential output for a quality game. Completing projects is the last but finest order.

 

by Clinton, 3Ddreamer


#16 frob   Moderators   -  Reputation: 20085


Posted 16 November 2013 - 11:39 PM

3Ddreamer said:
At this point, I have not found software that satisfies me by combining both version control and source control under one management structure. I will probably be forced to do what I do in 3D art creation and put two different programs (one for source control and one for version control) into the workflow pipeline for coding.

 

Perforce.

 

 

If all you are storing is text files, then Git is fine. Git starts having performance issues around the 15GB mark. If all you have is text files you won't hit that for several years, even in a corporate environment. But as you point out, when you have a large volume of frequently changing binary assets you can hit the 15GB mark pretty quickly.

 
Perforce is expensive, it is huge, it is centralized, and it has a learning curve, but it is the game industry standard for many excellent reasons (Google can describe them in depth). And Perforce isn't just the game industry's favorite. Microsoft uses Perforce as the core of its Source Depot, which has held the source of Windows and Office for a decade. Google claims to use Perforce for over 75,000 projects.

 

 

The current revision of our game's source and assets is about 300GB when downloaded. That includes not just a few hundred thousand source files, but also hundreds of thousands of PSDs, tens of thousands of Maya files, a massive audio clip library, and much, much more. Since we store everything in Perforce, our game's decade-long history comes to over 100TB, thanks to all the binary assets going back forever. Good luck distributing that over Git. Across the studio we have many projects, each with its own Perforce server. We share a huge amount of cross-studio development work, where Perforce proxies act much like a distributed version control system with local copies.

 

As for integration with other management structures, there are excellent integration tools for Perforce on just about every management system, from JIRA to Mantis.

 

If you are a hardcore Git fan, you can still use Perforce: it allows Git connections, where users can access a Perforce clientspec as a Git repository. They can do all the Git work they want, and when they merge it back in, it goes back to the Perforce mother ship. You are free to isolate your own little view of the data and pretend it is a nice small Git project, even when the complete project would be beyond Git's abilities.
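The flow described here matches, for example, Perforce's Git Fusion tool; roughly speaking (the server name below is fictional):

    # Clone a Perforce-backed view as if it were a plain Git repository
    git clone https://gitfusion.example.com/ourgame-engine
    cd ourgame-engine
    # ...normal Git branching, committing, merging...
    git push   # the server translates pushed commits into Perforce submits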

 

Like all systems, Perforce has its warts, and I normally don't recommend it for individual projects. It is expensive and difficult to learn. But if you need something that can handle a whole lot of everything, Perforce might be your answer.


Check out my personal indie blog at bryanwagstaff.com.

#17 3Ddreamer   Crossbones+   -  Reputation: 3129


Posted 17 November 2013 - 12:09 AM


frob said:
If you are a hardcore Git fan, you can still use Perforce: it allows Git connections, where users can access a Perforce clientspec as a Git repository. They can do all the Git work they want, and when they merge it back in, it goes back to the Perforce mother ship. You are free to isolate your own little view of the data and pretend it is a nice small Git project, even when the complete project would be beyond Git's abilities.

 

Bulls-eye! That is exactly what I needed to know. It will be a journey of a couple of years before I need Perforce, but since Git is my current thing, that looks very promising.

 

Thanks, frob


Personal life and your private thoughts always affect your career. Research is the intellectual backbone of game development and the first order. Version Control is crucial for full management of applications and software. The better the workflow pipeline, the greater the potential output for a quality game. Completing projects is the last but finest order.

 

by Clinton, 3Ddreamer


#18 Yrjö P.   Crossbones+   -  Reputation: 1412


Posted 17 November 2013 - 06:37 AM

Fundamentally I'm very much on the side of putting everything in (the same) version control, and of being able to reliably produce a working game by just pulling from the repo and building.

Part of my problem is that Unity3D seems to require, or strongly encourage, mixing code and data. I'm not sure if it's feasible to really clean that up, or to what degree.

If only our game code and data were properly separate, and we were able to handle missing assets sensibly on the code side, life would become a lot easier. Then we could use git, hg, or whatever for the code and any other solution for the data/assets (something that is easy for the artists), and the project would still be trivial to build and work on. Specific asset sets could be designated as "official" once in a while, and the build script in the code VCS would fetch the official asset set for that code version, as in the sketch below.
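A sketch of that arrangement, with the code repo pinning a named asset snapshot; the pin file, server, and transfer tool are all hypothetical:

    #!/bin/sh
    # fetch_assets.sh: the code repo versions only a tiny pin, a file
    # named ASSET_SET containing e.g.:  ASSET_SET=official-2013-11-15
    . ./ASSET_SET
    # Pull the matching snapshot from wherever the official sets are archived:
    rsync -a "assets-server:/snapshots/$ASSET_SET/" ./Assets/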

Edited by Yrjö P., 17 November 2013 - 06:40 AM.


#19 thok   Members   -  Reputation: 681


Posted 17 November 2013 - 11:06 AM

I think you guys are getting a bit off-topic, considering the scope of the OP's question and level of experience.

deavisdude: as was already suggested, have a look at Bitbucket; it supports both Git and Mercurial repos, and private repos are free. Pick either Git or Mercurial; it doesn't matter which one.

#20 DocBrown   Members   -  Reputation: 273


Posted 19 November 2013 - 11:14 AM

A suggestion if you're going to be using Visual Studio for development (not sure if you are, as it hasn't been discussed): Microsoft offers Team Foundation Service/Visual Studio Online with its newest Visual Studio versions, which allows for both version control and task allocation, depending on what development model you'll be using.

 

I'd also suggest that your team decide on a development methodology (Scrum, Kanban, Waterfall, etc.); this will greatly influence how you set up your project development and version control system.





