phr34k9

databases as asset-storage

17 posts in this topic

I am aware that in larger projects it might be favorable to store assets in a centralized database during the development cycle. For the most part I understand the reasoning and motivations for doing so, and I have concluded that I want to give the workflow it encourages a shot. However, not having any experience with the subject, I was wondering what a good setup would look like.

One of my main concerns is how the data stored in the database should be used by the game runtime. I strive for 'realtime editing', so what I am considering is a network interface that simply lets you perform RESTful operations to query or modify the data, plus a notification API. That way any client could, at any given time, query for an up-to-date asset as long as it is in the repository.
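
To make this concrete, here is a minimal sketch of the kind of query I have in mind; the host name and the /assets/&lt;guid&gt; endpoint are placeholders I invented, and it just pulls the latest bytes of one asset over HTTP with libcurl:

[code]
// Sketch of a RESTful "give me the latest version of this asset" query.
// The endpoint layout and host are placeholders, not an existing API.
// Requires libcurl.
#include <curl/curl.h>
#include <iostream>
#include <string>

static size_t writeToString(char* data, size_t size, size_t nmemb, void* userdata)
{
    std::string* out = static_cast<std::string*>(userdata);
    out->append(data, size * nmemb);
    return size * nmemb;
}

int main()
{
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();
    if (!curl) return 1;

    std::string body;
    // Hypothetical endpoint: fetch the latest revision of one asset by GUID.
    curl_easy_setopt(curl, CURLOPT_URL,
        "http://assets.example.local/assets/CB3158AC-D1C6-41B5-8006-221E326A7284");
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, writeToString);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);

    if (curl_easy_perform(curl) == CURLE_OK)
        std::cout << "fetched " << body.size() << " bytes\n";

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
[/code]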

Given that assets tend to contain large volumes of data, issuing a fetch operation every single time the sandbox starts might put serious strain on the network infrastructure, so presumably some architecture would be needed to cache frequently used assets in replicated asset bundles.

Overall, how does this strategy sound? To my recollection there aren't any published articles or presentations about this subject, are there?

Hmmm... well... using a centralized repository with a real-time client to manipulate data is... not fun at all. My experiences with this kind of tool are horrible (solo = no problem, 2 = a few problems, >2 = trouble explosion!).

Just use some kind of version management with a centralized repository, like SVN, where you are able to check out/in, lock and revert changes. It is as simple as that. Although Subversion (SVN) is not meant as an asset management tool, it is free and powerful, though there are commercial asset management tools out there which could do the task better if you want to spend some $$$.

Tools are important, but don't dig into tool creation when there are already tools available; create some games instead :P

It's hard to say without knowing anything about your game. If you'd care to give us a summary of your architecture and design so far, we may be able to point you in the right direction.

With regards to data storage and caching, you could look into NoSQL datastores. Keep a relational database around for durable storage and use a NoSQL store (such as Redis or MongoDB; there are many different stores, so choose the one that is right for your task) for fast access to your cache.
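
The read-through pattern itself is tiny. A rough sketch, with the cache and the durable store hidden behind made-up in-memory stand-ins (you would swap in a real Redis or SQL client here):

[code]
// Read-through cache sketch: try the fast cache first, fall back to the
// durable store, then populate the cache. All types here are placeholders.
#include <iostream>
#include <optional>
#include <string>
#include <unordered_map>

struct DurableStore {
    // Stand-in for the relational database; replace with real queries.
    std::unordered_map<std::string, std::string> rows;
    std::optional<std::string> load(const std::string& key) {
        auto it = rows.find(key);
        if (it == rows.end()) return std::nullopt;
        return it->second;
    }
};

class AssetCache {
public:
    // Stand-in for Redis/MongoDB/etc.
    std::optional<std::string> get(const std::string& key) {
        auto it = entries_.find(key);
        if (it == entries_.end()) return std::nullopt;
        return it->second;
    }
    void put(const std::string& key, const std::string& value) {
        entries_[key] = value;
    }
private:
    std::unordered_map<std::string, std::string> entries_;
};

std::optional<std::string> fetchAsset(AssetCache& cache, DurableStore& db,
                                      const std::string& key)
{
    if (auto hit = cache.get(key))      // 1) cheap lookup
        return hit;
    auto value = db.load(key);          // 2) authoritative lookup
    if (value)
        cache.put(key, *value);         // 3) populate for next time
    return value;
}

int main() {
    DurableStore db;
    db.rows["textures/rock.dds"] = "...asset bytes...";

    AssetCache cache;
    fetchAsset(cache, db, "textures/rock.dds");              // miss -> hits the database
    auto cached = fetchAsset(cache, db, "textures/rock.dds"); // served from the cache
    std::cout << (cached ? "hit\n" : "miss\n");
    return 0;
}
[/code]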

[quote name='Ashaman73' timestamp='1328868401' post='4911609']
Hmmm... well... using a centralized repository with a real-time client to manipulate data is... not fun at all. My experiences with this kind of tool are horrible (solo = no problem, 2 = a few problems, >2 = trouble explosion!).

Just use some kind of version management with a centralized repository, like SVN, where you are able to check out/in, lock and revert changes. It is as simple as that. Although Subversion (SVN) is not meant as an asset management tool, it is free and powerful, though there are commercial asset management tools out there which could do the task better if you want to spend some $$$.

Tools are important, but don't dig into tool creation when there are already tools available; create some games instead :P
[/quote]

Did I read this right? Using SVN as a database? Dear God. How do you propose to format the data? More importantly, how do you query it?

[quote name='phr34k9' timestamp='1328867629' post='4911608']

One of my main concerns is how the data stored in the database should be used by the game runtime. I strive for 'realtime editing', so what I am considering is a network interface that simply lets you perform RESTful operations to query or modify the data, plus a notification API. That way any client could, at any given time, query for an up-to-date asset as long as it is in the repository.

[/quote]

This particular paragraph in your question gave me an idea. You may be able to use messaging technology (AMQP, for example) to send and receive notifications about data changes. If I understand your problem correctly, I can sketch an idea of how this would work.

* User A opens record 7 for viewing
* User B also opens record 7 for viewing
* Both users 'subscribe' to updates about record 7
* User A makes an edit -> notification is broadcast to all record 7 subscribers
* All subscribers to record 7 updates can react appropriately to the update (query a datastore, display a message, whatever)

Have a look at RabbitMQ: http://www.rabbitmq.com/
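
The client-side bookkeeping boils down to something like this. Note this is a deliberately in-process sketch of the subscribe/notify pattern, not actual AMQP client code; in the real setup publish() would go through a broker exchange with, say, one routing key per record:

[code]
// In-process sketch of the per-record subscribe/notify pattern described above.
// In practice the broadcast would travel through a message broker (e.g. an AMQP
// exchange with one routing key per record) instead of this local map.
#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

class ChangeBus {
public:
    using Handler = std::function<void(const std::string& change)>;

    void subscribe(int recordId, Handler handler) {
        subscribers_[recordId].push_back(std::move(handler));
    }

    void publish(int recordId, const std::string& change) {
        for (auto& handler : subscribers_[recordId])
            handler(change);
    }

private:
    std::unordered_map<int, std::vector<Handler>> subscribers_;
};

int main() {
    ChangeBus bus;
    // User A and user B both open record 7 and subscribe to updates about it.
    bus.subscribe(7, [](const std::string& c) { std::cout << "A sees: " << c << "\n"; });
    bus.subscribe(7, [](const std::string& c) { std::cout << "B sees: " << c << "\n"; });

    // User A edits record 7 -> every subscriber reacts (re-query, refresh UI, ...).
    bus.publish(7, "record 7: diffuse texture replaced");
    return 0;
}
[/code]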

So far it's technically not a game but more like engine infrastructure to facilitate collaborative open-world development. Given the wide variety of assets (decals and the like), I see the potential need to manage on the order of 1,000 to 10,000 assets. That is one of the reasons why I am leaning towards centralized servers for management, replication and versioning. What I envision is in essence a web interface that empowers anybody to submit assets, manages dependencies between the assets, and prevents complicated machine setups.

To elaborate a bit on my architecture, as you requested: I have abstracted away file paths and direct file I/O. At the moment this is nothing more than mapping an 'asset URI' to a physical location on the hard drive, which effectively simulates mounted folders.

A small example: {CB3158AC-D1C6-41B5-8006-221E326A7284} is mapped to Z:\Content Packages\content, and my own fopen and FileStream classes take care of mapping to the appropriate file. Similar to a drive letter, the GUID identifies the repository, and metadata in the folder structure would allow me to identify the file-system technology, i.e. native OS file system, NoSQL, BerkeleyDB. The goal of this abstraction was originally to allow remounting of asset repositories so that you can easily switch where you want to store the assets: an NTFS share, another disk drive, and so on. A bonus is that these repositories could be shared between games that are in development.
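
Conceptually the resolution step amounts to something like this (a stripped-down sketch rather than my actual classes; the GUID and paths are just examples):

[code]
// Simplified sketch of the asset-URI abstraction: a GUID identifies the
// repository (like a drive letter) and is remapped to wherever that
// repository is currently mounted. Paths below are made-up examples.
#include <cstdio>
#include <stdexcept>
#include <string>
#include <unordered_map>

class MountTable {
public:
    void mount(const std::string& repositoryGuid, const std::string& physicalRoot) {
        mounts_[repositoryGuid] = physicalRoot;
    }

    // "{GUID}/textures/rock.dds" -> "Z:\Content Packages\content\textures\rock.dds"
    std::string resolve(const std::string& assetUri) const {
        const size_t slash = assetUri.find('/');
        if (slash == std::string::npos)
            throw std::runtime_error("malformed asset uri: " + assetUri);
        const std::string guid = assetUri.substr(0, slash);
        const auto it = mounts_.find(guid);
        if (it == mounts_.end())
            throw std::runtime_error("repository not mounted: " + guid);
        return it->second + "\\" + assetUri.substr(slash + 1);
    }

private:
    std::unordered_map<std::string, std::string> mounts_;
};

int main() {
    MountTable mounts;
    mounts.mount("{CB3158AC-D1C6-41B5-8006-221E326A7284}", "Z:\\Content Packages\\content");

    const std::string path =
        mounts.resolve("{CB3158AC-D1C6-41B5-8006-221E326A7284}/textures/rock.dds");
    std::FILE* file = std::fopen(path.c_str(), "rb");  // a custom fopen would do this mapping
    if (file) std::fclose(file);
    return 0;
}
[/code]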

[quote name='thok' timestamp='1329004795' post='4912110']
This particular paragraph in your question gave me an idea. You may be able to use messaging technology (AMQP, for example) to send and receive notifications about data changes. If I understand your problem correctly, I can sketch an idea of how this would work.
[/quote]

Your suggestion is much appreciated, but my concern was not per se how to receive notifications but more along the lines of what the game runtime has to do with the data it receives. What caching policies should it apply? How should it store the data? One option could be to 'cache' assets in a client-side database, but what implications does that have for the load/search times of assets? Would the overhead be negligible overall, or would it push the idea into the category of 'looks good on paper, will never work in practice'?

[quote name='phr34k9' timestamp='1329040340' post='4912224']

Your suggestion is much appreciated, but my concern was not per se how to receive notifications but more along the lines of what the game runtime has to do with the data it receives. What caching policies should it apply? How should it store the data? One option could be to 'cache' assets in a client-side database, but what implications does that have for the load/search times of assets? Would the overhead be negligible overall, or would it push the idea into the category of 'looks good on paper, will never work in practice'?
[/quote]

Ah, I misunderstood the core issue, then. My answer for now would have to be more questions:
- What are some specific use cases/scenarios for this system?
- Can users lock an asset for editing, or can anyone potentially edit anything (creating also the potential for merge conflicts)?
- Is your data plaintext, binary, or both?
- In your initial post, you mentioned "[storing] assets in a centralized database". Were you referring literally to a (relational) database, or were you referring generically to some other type of data repository?

Perhaps I shouldn't have been so quick to shoot down Ashaman73's SVN idea, at least with respect to version control. It sounds like you're describing a situation where a distributed version control system __might__ apply. SVN is not distributed of course; git, however, comes to mind as a possible solution. Pretty much all of your concerns about caching, client-side databases, and even performance should be no problem for git. It's very fast. Assuming a DVCS actually applies to your problem, of course....

I've worked with asset databases before, and they come with their own set of problems.

Yes, there are some benefits. The real-time edits you mention are just one of many potential benefits.

But problems include:
* Inconsistent sets of assets due to live updates
* Live updates of assets you don't want changed, or modifications by other people
* Difficulty in syncing to a previous version
* Extreme difficulty in reconstructing a specific build in the past

It is almost impossible to sync to a specific changelist when you must synchronize it with one or more databases at the same time; did the build server pull from the database right before, or right after? If the desired changelist was at 12:57:08 and a database change was also at 12:57:08, do you include the database change or not?

It is almost as easy to just use FindFirstChangeNotification() to monitor your directory tree. It can be a tiny bit more work to reload assets when they change, but most of the work would be very similar. Watching for changed files gives you the benefits of live updates without most of the problems listed above. You can still end up with inconsistent changes being applied, but that is entirely within your control instead of outside it.
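
For reference, the watch loop is only a handful of lines. A bare-bones sketch; the directory path is just an example, and a real tool would then work out which files actually changed (e.g. by comparing timestamps) and reload only those:

[code]
// Minimal sketch of watching an asset directory with the Win32 change
// notification API mentioned above.
#include <windows.h>
#include <cstdio>

int main()
{
    // Example asset root; watch the whole tree for writes and renames.
    HANDLE watch = FindFirstChangeNotificationA(
        "C:\\dev\\project\\assets", TRUE,
        FILE_NOTIFY_CHANGE_FILE_NAME | FILE_NOTIFY_CHANGE_LAST_WRITE);
    if (watch == INVALID_HANDLE_VALUE)
        return 1;

    for (;;)
    {
        // Block until something in the tree changes.
        if (WaitForSingleObject(watch, INFINITE) != WAIT_OBJECT_0)
            break;

        std::printf("asset directory changed - rescan and reload dirty assets\n");

        // Re-arm the notification for the next change.
        if (!FindNextChangeNotification(watch))
            break;
    }

    FindCloseChangeNotification(watch);
    return 0;
}
[/code]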

[quote]you could look into NoSQL datastores[/quote]

Of course, they are web scale....

[quote]What I envision is in essence a web interface that empowers anybody to submit assets, manages dependencies between the assets, and prevents complicated machine setups.[/quote]

Look at how git works. Not git the application (git simply does not work for non-diffable/binary content), but the DVCS design and how the change-tracking log works.

Each change is a node in a graph. When a user decides to check out a specific revision, they select that node, which contains backlinks to older content.

When updating, the user walks back through these changes, building the new items incrementally through diffs. Caching here is implicit: any blob available to the client is either fully known or can be built from the change log.



For assets, git fails. The difficulty of managing changes to binary assets makes such operations infeasible. So instead of using raw diffs, adopt only the change-tracking mechanism, not the diffs.

When a client requests repository state 0x41a4fcb, fetch that node. Examine it to see which resources it contains, follow the backlink to the previous patch, and so on until you have all the resources. Instead of rebuilding from diffs, request the complete latest resources from the repository.

For a custom solution, the above can be done server-side to avoid per-request latency. Tell the server: my repository is at 0x4132, give me the latest (or 0xfa81). The server walks the change graph and returns a single file listing all resources which need to be downloaded.


Since changes are incremental, the graph is immutable. If you have resource 0x4132, it will never be changed again; any user that makes any change gets a new ID. Such an approach maps naturally to a RESTful design. It also solves the caching problems: since each resource is immutable, it can be evicted from the cache at any time and retrieved again later.
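
A rough sketch of that walk, using made-up structures: each revision node lists the resources it touched and points back to its parent, and keeping the first blob ID seen per resource while walking backwards gives you the complete latest set.

[code]
// Sketch of resolving "give me everything as of revision X" by walking the
// immutable change graph backwards. Structures and IDs are illustrative only.
#include <iostream>
#include <string>
#include <unordered_map>

struct Revision {
    std::string parent;                                    // empty for the root
    std::unordered_map<std::string, std::string> touched;  // resource name -> blob id
};

// Walk from 'head' back to the root; the first blob id seen for a resource is
// the newest one, so older revisions never overwrite it.
std::unordered_map<std::string, std::string>
latestResources(const std::unordered_map<std::string, Revision>& graph,
                std::string head)
{
    std::unordered_map<std::string, std::string> result;
    while (!head.empty()) {
        const Revision& rev = graph.at(head);
        for (const auto& [resource, blobId] : rev.touched)
            result.emplace(resource, blobId);   // keep only the newest id
        head = rev.parent;
    }
    return result;
}

int main() {
    std::unordered_map<std::string, Revision> graph = {
        {"0x0001", {"",       {{"rock.dds", "blob-a"}, {"level1.map", "blob-b"}}}},
        {"0x4132", {"0x0001", {{"rock.dds", "blob-c"}}}},   // rock.dds was re-exported
    };

    for (const auto& [resource, blobId] : latestResources(graph, "0x4132"))
        std::cout << resource << " -> " << blobId << "\n";  // download blobs not cached yet
    return 0;
}
[/code]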


To avoid solving the wrong problem, look at how to extract change information from an existing repository. SVN is fine, and probably better than a DVCS for binary assets. Perforce is usually preferred. Then format those change logs so they can be retrieved via a web service. Either way, you still need a reliable versioning system at the back; just keeping track of the change graph is not enough, and the hard problems lie elsewhere, so don't try reinventing those.


The performance of such a system is hard to guess. Git et al. struggle with content parsing and ID generation, but the other parts scale relatively well most of the time. The above is merely an interface to an existing service.

----

Monitoring for file-system changes (as mentioned above) is a frequently used solution. Instead of inventing some complex mechanism, you just use whatever versioning system you have on top of the file system.

It's far from perfect or "pure", but it works in practice.

Thanks for the much-desired information. The change-tracking system of git looks like a promising candidate indeed, giving both the ability to pin a session to a specific version and live updates. Technological complications seem to be the more pressing issue with centralized storage, rather than the performance penalties I feared. That gives me confidence the idea could very well be practical, and a reason to invest in an implementation.

[quote name='Antheus' timestamp='1329146845' post='4912626']

Of course, they are web scale....
[/quote]

Are you mocking me, sir?

[quote]
Look at how git works. Not git the application (git simply does not work for non-diffable/binary content), but the DVCS design and how the change-tracking log works.
[/quote]

Yes, the history and change tracking can be used. Any other VCS is going to have the same problem with trying to diff arbitrary binary formats, no?

[quote]
For assets, git fails. The difficulty of managing changes to binary assets makes such operations infeasible. So instead of using raw diffs, adopt only the change-tracking mechanism, not the diffs.
[/quote]

See above.

[quote]
SVN is fine, and probably better than a DVCS for binary assets.
[/quote]

Why? Please explain.

[quote]
Perforce is usually preferred.
[/quote]

Why? I haven't used Perforce, to be honest, but I would stay away from it unless absolutely necessary simply because of the licensing costs.
What would be the technical advantage of using it over, for example, git?

[quote]
Instead of inventing some complex mechanism, you just use whatever versioning system you have on top of the file system.
[/quote]

And which file system would that be?

[quote]Why? Please explain.[/quote]

A DVCS relies on diffs for efficiency; source code specifically is a natural fit. Binary assets don't diff naturally, resulting in considerable overhead, both for resources and for management. As the repository grows, SVN scales better from the user's perspective.

There is a special git version/patch for storing blobs, but if you're using a workaround you might as well go all the way, since using multiple repositories loses the uniform change-management advantage.

For source, operations like bisect and three-way merge make sense. For binary assets they don't. If two people edit the same PSD, merging doesn't make conceptual sense in the way one can merge edits to textual content.

[quote]Yes, the history and change tracking can be used. Any other VCS is going to have the same problem with trying to diff arbitrary binary formats, no?[/quote]

Synchronizing a centralized VCS doesn't require bringing in the entire change history; one just gets a snapshot, and partial checkouts are also used.

[quote]Why? I haven't used Perforce, to be honest, but I would stay away from it unless absolutely necessary simply because of the licensing costs.
What would be the technical advantage of using it over, for example, git?[/quote]

Proven track record.

SVN was mentioned as a free alternative.

[quote]And which file system would that be?[/quote]

Whatever works or comes with the OS: NTFS, ext, ...

[quote name='Antheus' timestamp='1329167738' post='4912741']

A DVCS relies on diffs for efficiency; source code specifically is a natural fit. Binary assets don't diff naturally, resulting in considerable overhead, both for resources and for management. As the repository grows, SVN scales better from the user's perspective.
[/quote]

It scales better in what way? 'User perspective' is vague.

[quote]
There is a special git version/patch for storing blobs, but if you're using a workaround you might as well go all the way, since using multiple repositories loses the uniform change-management advantage.
[/quote]

What 'special git version/patch'? Git can store binary blobs out of the box; there's no patch required.
And what's this about multiple repositories? That's the nature of DVCS. There's no central repo. You can, however, designate a 'primary' and treat secondaries as 'forks' (think 'GitHub').

[quote]
For source, operations like bisect and three-way merge make sense. For binary assets they don't.
[/quote]

You're absolutely right. SVN has the same problem. So what's your point?

[quote]
Synchronizing a centralized VCS doesn't require bringing in the entire change history; one just gets a snapshot, and partial checkouts are also used.
[/quote]

And what's wrong with bringing in the entire change history? The compression in git is amazing and if you push upstream frequently, your updates are small.

[quote]
Proven track record.
[/quote]

Proven track record of what? Again, vague.

[quote]
Whatever works or comes with the OS: NTFS, ext, ...
[/quote]

Oh, I thought you were talking about a versioning file system: http://en.wikipedia.org/wiki/Versioning_file_system. You were just talking about plain old file systems. So tell me, what does that file system have to do with the OP's problem?

[quote]Proven track record of what? Again, vague.[/quote]

Of use in and out of industry.

[quote]The compression in git is amazing and if you push upstream frequently, your updates are small.[/quote]

How well does it work for raw video assets, where each file is 500 MB+? And for raw images?

[quote]So tell me, what does that file system have to with the OP's problem?[/quote]

In your running app, you hook into directory change notifications. Whenever any of the files change, reload them.

To update a running app, use a regular VCS checkout working on top of a plain old file system. Presto: versioned on-the-fly updates, exactly what the OP asked for, without reinventing an entire VCS and everything else.

[quote]
Of use in and out of industry.
[/quote]
That's not a "technical advantage"; you have said nothing to back up your statement.

[quote]
How well does it work for raw video assets, where each file is 500 MB+? And for raw images?
[/quote]
In what (real world) use case would one do such a thing?

[quote name='thok' timestamp='1329171797' post='4912757']And what's wrong with bringing in the entire change history? The compression in git is amazing and if you push upstream frequently, your updates are small.[/quote]A [i]single revision[/i] for a modest console game's source assets repository ([i]which ends up on a DVD-sized disk after compilation[/i]) could be hundreds of gigabytes. The entire change history could be several dozen terabytes.
If you've got 50 people on the project, it's obviously cheaper to have a central storage server with several terabytes available than to require every developer to have a several-terabyte RAID setup themselves. Plus, when a new developer joins the project, you don't want to be doing a several-terabyte download via [font=courier new,courier,monospace]git://[/font] etc.
At these large scales, DVCS for binary assets simply falls apart, [i]so far[/i]. I really do hope this situation is rectified, because git is great for code or at smaller scales of binary data. Large projects would really need a hybrid git, where it's mostly DVCS but certain directories could be centralised.
[quote name='thok' timestamp='1329177653' post='4912795']In what (real world) use case would one do such a thing?[/quote]
On a project that requires video or image files? You always need to store the original (non-lossy-compressed) files in your source repo.
[quote name='thok' timestamp='1329156508' post='4912670']Why [Perforce is usually preferred]? I haven't used Perforce, to be honest, but I would stay away from it unless absolutely necessary simply because of the licensing costs.
What would be the technical advantage of using it over, for example, git?[/quote]
In terms of assets, it's superior to git because of the DVCS issues already mentioned. The real question then is: why is it superior to SVN ([i]seeing as SVN is free[/i])? For a small project with no money to throw around on spurious licensing, it probably doesn't matter. On the larger scale, though, it's simply much more efficient than SVN ([i]e.g. when managing, branching, or downloading hundreds of gigs of binary data[/i]).

[edit]Time-travelling quote:
[quote]but I think you missed some specific points in the OP's question. We're not talking about version control for development artifacts. We're talking about this:[/quote]
I wasn't trying to respond to the OP, sorry, and I'm pretty sure the questions of yours that I answered were off-topic, so I was just trying to quash this off-topic trolling between you and Antheus about how useful git is for binary data.

[quote name='Hodgman' timestamp='1329177929' post='4912797']
[quote name='thok' timestamp='1329171797' post='4912757']And what's wrong with bringing in the entire change history? The compression in git is amazing and if you push upstream frequently, your updates are small.[/quote]A [i]single revision[/i] for a modest console game's source assets repository ([i]which ends up on a DVD-sized disk after compilation[/i]) could be hundreds of gigabytes. The entire change history could be several terabytes.
If you've got 50 people on the project, it's obviously cheaper to have a central storage server with several terabytes available than to require every developer to have a several-terabyte RAID setup themselves. Plus, when a new developer joins the project, you don't want to be doing a several-terabyte download via [font=courier new,courier,monospace]git://[/font] etc.[quote name='thok' timestamp='1329177653' post='4912795']In what (real world) use case would one do such a thing?[/quote]On a project that requires video or image files? You always need to store the original (non-lossy-compressed) files in your source repo.
[/quote]

Point taken, but I think you missed some specific points in the OP's question. We're not talking about version control for development artifacts. We're talking about this:
[quote]
One of my main concerns is how the data stored in the database should be used by the game runtime. I strive for 'realtime editing', so what I am considering is a network interface that simply lets you perform RESTful operations to query or modify the data, plus a notification API. That way any client could, at any given time, query for an up-to-date asset as long as it is in the repository.
[/quote]

As a side note, if _all_ of the dev artifacts (video, sound, code, etc.) are being stored in a single repo (without any apparent thought to compartmentalization), that's just insane. Whoever does this should be beaten. Yes, obviously it would be insane to check out _everything_ if you want to modify a single line of code. That's what superprojects/submodules are for: http://en.wikibooks.org/wiki/Git/Submodules_and_Superprojects
It's effectively the same thing as a partial checkout.

[quote]Yes, obviously it would be insane to check out _everything_ if you want to modify a single line of code.[/quote]

Which is nice in theory, but in practice you often need the whole thing. Even better, you want the whole thing.

Here's a scenario: an asset is crashing your runtime. Not knowing whether it's the code, the asset or the pipeline, you try a few things. Since problems are detected during QA, you need to replicate the failing version. The build server may help, but ultimately you need to test individual pieces to pinpoint it.

To go even faster, write a test, bisect to see when it started failing, then fix the change. At a minimum, the build server needs to do a full checkout and possibly a deployment.

A single repository across all people/teams/the whole company is nice, but it doesn't always work.

Git, as nice as it is, is not the first, last and only word when it comes to VCS.
