Our New Game: I Am Dolphin

Published October 08, 2014
After an incredibly long stretch of quiet development, our new game, I Am Dolphin, will be available this Thursday, October 9th, on the Apple/iOS App Store. This post covers the background and the game itself; I'm planning to post more technical information about the game and its development in the future. That depends somewhat on people reading and commenting - tell me what you want to know about the work, and I'll answer as much as I can.

(Image: I Am Dolphin logo)

For those of you who may not have followed my career path over time: A close friend and I have spent quite a few years doing R&D with purely physically driven animation. There's plenty of work out there on the subject; ours is not based on any of it and takes a completely different approach. About three years ago, we met a neurologist at the Johns Hopkins Hospital who helped us set up a small research group at Hopkins to study biological motion and create a completely new simulation system from the ground up, based around neurological principles and hands-on study of dolphins at the National Aquarium in Baltimore. Unlike many other physical animation systems, including our own previous work, the new work allows the physical simulation to be controlled as a player character. We also developed a new custom in-house framework, called the Kata Engine, to make the simulation work possible.

One of the goals in developing this controllable simulation was to learn more about human motor control, and specifically to investigate how to apply this technology to recovery from motor impairments such as stroke. National Geographic was kind enough to write some great articles on our motivations and approach:

Virtual Dolphin On A Mission

John Krakauer's Stroke of Genius

Although the primary application of our work is medical and scientific, we've also spent our spare time creating a game company, Max And Haley LLC, and a purely entertainment-focused version of the game. This is the version that will be publicly available in a scant few days.

(Image: Max And Haley logo)

Here is a review of the game by AppUnwrapper:

"I got my hands on the beta version of the game, and it's incredibly impressive and addictive. I spent two hours playing right off the bat without even realizing it, and have put in quite a few more hours since. I just keep wanting to come back to it. iPhones and iPads are the perfect platform for the game, because they allow for close and personal, tactile controls via simple swipes across the screen."
I now have three shipped titles to my name; I'd say this is the first one I'm really personally proud of. It's my firm belief that we've created something that is completely unique in the gaming world, without being a gimmick. Every creature is a complete physical simulation. The dolphins you control respond to your swipes, not by playing pre-computed animation sequences but by actually incorporating your inputs into the drive parameters of the underlying simulation. The end result is a game that represents actual motion control, not gesture-recognition based selection of pre-existing motions.
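To make the distinction concrete, here is a minimal sketch of what input-driven simulation control looks like, as opposed to gesture-triggered animation playback. All names and constants here are illustrative; this is not the Kata Engine's actual API.

```python
import math

# Hypothetical sketch: a swipe feeds the simulation's drive parameters
# rather than selecting a pre-computed animation. Names and constants
# are illustrative, not the Kata Engine's actual API.
def swipe_to_drive(swipe_dx, swipe_dy, current, blend=0.25):
    """Fold a swipe vector into the dolphin's drive parameters."""
    # Swipe direction sets the desired heading; swipe length sets effort.
    target_heading = math.atan2(swipe_dy, swipe_dx)
    effort = min(1.0, math.hypot(swipe_dx, swipe_dy) / 300.0)
    # Steer the existing drive state toward the new targets; the physical
    # simulation (not an animation clip) then produces the motion.
    return {
        "heading": current["heading"] + blend * (target_heading - current["heading"]),
        "thrust": current["thrust"] + blend * (effort - current["thrust"]),
    }

state = {"heading": 0.0, "thrust": 0.0}
state = swipe_to_drive(200.0, 0.0, state)  # a rightward swipe builds thrust
```

The key property is that player input only nudges the drive state; the motion the player sees always comes out of the simulation itself.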

As I said at the beginning of the post, this is mostly a promotional announcement. However, this is meant to be a technical blog, not my promotional mouthpiece, so I want to dig deep into the actual development and technical aspects of this game. There's a lot to talk about in the course of developing a game with a three-person (2x coder, 1x artist) team, building a complete cross-platform engine from nothing, all against the backdrop of an academic research hospital environment. Then there's the actual development of the simulation, which included a lot of interaction with the dolphins, the trainers, and the Aquarium staff. We did a lot of filming (but no motion capture!) in the course of the development as well; I'm hoping to share some of that footage going forward.

Here's a slightly older trailer - excuse the wrong launch date on this version. We decided to slip the release by two months after this was created - that's worth a story in itself. It is not fully representative of the final product, but our final media isn't quite ready. Note that everything you see in the trailer is real gameplay footage on the iPad. Every last fish, shark, and cetacean is a physical simulation with full AI.


Comments

jbadams

Looks interesting -- kinda wish I still had an Apple device to try it out! Sounds like you'll have a lot of material to cover in future!

October 08, 2014 04:51 AM
dsm1891

Will it have flipper support?

October 08, 2014 01:46 PM
Promit

> Will it have flipper support?

I am not sure what you're asking. The dolphin's fins move on their own based on your input.

October 08, 2014 03:05 PM
Cornstalks

> Will it have flipper support?

> I am not sure what you're asking. The dolphin's fins move on their own based on your input.

I'm guessing he's talking about Flipper the TV show/movie.

I love the way the dolphin/fish move!

October 08, 2014 04:43 PM
TheChubu

Cutest dolphin ever.

October 09, 2014 12:42 AM
mikeman

It's not every day you see such innovation happening in the mobile game industry. Congratulations for an amazing product!

October 09, 2014 10:48 AM
Sik_the_hedgehog

While I know you're talking mostly about the AI here, I can't help but be more interested in dolphins being able to eat sharks (I was looking at another footage video). Which also made me realize that in Ecco the move used to eat fish and to attack sharks is the same. Which means that Ecco has been eating sharks all this time. *mindblown*

What do you think will eventually come out of the research (game aside)? I guess that you showed how it can affect AI in a game, but what about outside gaming as well?

October 09, 2014 07:17 PM
Sector0

Great job for creating those procedural animations.

October 09, 2014 10:17 PM
Geometrian

Gentle criticism: I'd like to see some better graphics--in particular some fake underwater caustics and some splashing on the surface. It doesn't look like you're taxing the GPU much. Regarding the animation, the characters look like they are far too maneuverable.

All said, knowing how difficult AI and realistic, physical animation is, I am very impressed. Good work!

October 10, 2014 05:30 AM
Promit

> Gentle criticism: I'd like to see some better graphics--in particular some fake underwater caustics and some splashing on the surface. It doesn't look like you're taxing the GPU much. Regarding the animation, the characters look like they are far too maneuverable.

> All said, knowing how difficult AI and realistic, physical animation is, I am very impressed. Good work!

GPU-wise, you have to keep in mind that we are driving the whole system at retina resolution and 60 fps. That means as much as 2048x1536 resolution, sixty times a second, on a mobile GPU that overheats in the first minute and clocks down to its actual sustainable speed. The iPad Air still has some GPU muscle to spare, as do the new iPhone 6 and 6+. The iPad Mini Retina and iPhone 5 are running right at the line, and I have to shut off features on the iPad 4, 3, and 2 to hit that golden 60 fps number.

I am not a believer in making gorgeous screenshots at the cost of actual gameplay. I needed a visual that I could create on all devices, solidly at 60 fps, at very high resolutions, and maintain that way beyond an hour of play. While I haven't had the time to push all of the devices all of the way to the edge, these GPUs are all being pushed very hard. Which, by the way, shows up as unbelievably high power consumption by the game. We are a battery destroyer ;)

That said, caustics, splashes, and a variety of other special effects will most likely be added to the top spec devices in the near future. Splashes in particular were a dev time issue; I'm running a grid based water simulation in there, but it still looks crappy when I'm deforming it and I haven't had time to get the shaders and deformation right aesthetically. Caustics are behind a commented block of code because I'm not quite happy with their aesthetics or performance yet.

As far as maneuverability goes -

1) It's a game. More responsive and controllable wins over more realistic.

2) You might be surprised by what dolphins are actually capable of underwater. Several times, we saw the simulation do something that looked implausible, only to go visit the dolphins in the Aquarium and realize that they were actually doing it if you knew when and where to look.

October 10, 2014 05:42 AM
Geometrian

I see textured, diffuse shading on maybe up to 30 animated objects, maybe 100 particles, and in a few shots some distant terrain. A Raspberry Pi Model A can do the graphics part of that at 30Hz in full HD using its crappy 4-core Broadcom GPU.

I was surprised then--nay amazed--to find that the 3rd- and 4th-generation iPads actually have comparable GPUs. The Air and even the iPhone 6 aren't much better. I was under the impression that mobile devices were maybe a decade back on the GPU curve. It's looking closer to two.

Given that information, I'm actually impressed you got this kind of graphics. For caustics, I was going to suggest some large textured quads--but you're almost certainly fillrate-bound at this resolution, which also explains your simple shading model. Updating animation geometry I imagine is also a significant challenge--especially since it looks like you used almost all your polygon budget on animated geometry. I'd be interested to hear about how you do skinning.

I likewise believe that interactivity trumps quality. Further, I find anything less than 60Hz unplayable. As above, I'm impressed you managed even that.

My research machine has 5,088 GPU cores. You're stuck with 4. Thank you for reminding me why I don't do mobile development; I retract my graphics criticism.

----

For maneuverability, I was referring specifically to one scene in which a killer whale turns around in a half second or so (around 1:04 in the trailer above). My impression was that they are larger, too. I'm also pretty sure they can't do double backflips when jumping.

But yes, certainly "More responsive and controllable wins over more realistic."

-G

October 10, 2014 07:58 PM
Promit

> I see textured, diffuse shading on maybe up to 30 animated objects, maybe 100 particles, and in a few shots some distant terrain. A Raspberry Pi Model A can do the graphics part of that at 30Hz in full HD using its crappy 4-core Broadcom GPU.

> I was surprised then--nay amazed--to find that the 3rd- and 4th-generation iPads actually have comparable GPUs. The Air and even the iPhone 6 aren't much better. I was under the impression that mobile devices were maybe a decade back on the GPU curve. It's looking closer to two.

> Under that information, I'm actually impressed you got this kind of graphics. For caustics, I was going to suggest some large textured quads--but you're almost certainly fillrate-bound at this resolution, which also explains your simple shading model. Updating animation geometry I imagine is also a significant challenge--especially since it looks like you used almost all your polygon budget on animated geometry. I'd be interested to hear about how you do skinning.

> I likewise believe that interactivity trumps quality. Further, I find anything less than 60Hz unplayable. As above, I'm impressed you managed even that.

> My research machine has 5,088 GPU cores. You're stuck with 4. Thank you for reminding me why I don't do mobile development; I retract my graphics criticism.

Thanks - but you're missing somewhere between half and two thirds of the frame time :) I think I'll try and do a full breakdown of the graphics in another post. The water surface takes up at least half the frame time, maybe more on lower devices. It includes heavy tessellation (because I'm planning to do deformation/splashing down the line), three texture samples (one dependent), lots of ALU, and takes up a good chunk of the screen. The background is also not flat shaded; it's doing a dynamic gradient AND I'm adding film grain in on many devices to fix gradient banding. On top of that I have to fog-fade the creatures into the background, which means they're all computing the background (with grain!) too.

It's the subtle stuff that kills you sometimes. These things are minor and slip under conscious notice on screen, but they're key to the look.

Packing that stuff in at 60 fps, full resolution, was a challenge. You guessed correctly - the graphics is extremely fill rate and bandwidth bound. Every extra texture sample makes things a lot worse. The poly counts, despite being quite high, basically disappear against how much time is being spent on pixel processing. I don't do anything clever for skinning; we do a simple linear combination of bone matrices (which are derived from physics rigid bodies) in the shader.
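The linear-combination skinning described here can be sketched like this - a plain CPU version for clarity, with illustrative names; in the game it runs in the vertex shader, and the bone matrices come out of the physics rigid bodies:

```python
# Linear blend skinning sketch (CPU-side, for clarity).
# Matrices here are 3x4: a 3x3 rotation plus a translation column.

def transform(mat, v):
    """Apply a 3x4 bone matrix to a position."""
    return tuple(sum(mat[r][c] * v[c] for c in range(3)) + mat[r][3]
                 for r in range(3))

def skin_vertex(position, bones, weights):
    """Weighted sum of per-bone transforms; weights should sum to 1."""
    out = [0.0, 0.0, 0.0]
    for mat, w in zip(bones, weights):
        p = transform(mat, position)
        for i in range(3):
            out[i] += w * p[i]
    return tuple(out)

identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
shifted  = [[1, 0, 0, 2], [0, 1, 0, 0], [0, 0, 1, 0]]  # translate +2 in x
# A vertex weighted evenly between the two bones moves +1 in x.
result = skin_vertex((0.0, 0.0, 0.0), [identity, shifted], [0.5, 0.5])
```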

(As far as the iPad GPU, remember that 2048x1536 @ 60 is 3x as many pixels per second compared to 1080p30, and the thing has to sink the GPU heat in a very narrow space.)
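The pixel-rate arithmetic behind that comparison:

```python
# Pixels shaded per second: retina iPad at 60 fps vs. 1080p at 30 fps.
retina_rate = 2048 * 1536 * 60   # 188,743,680 pixels/s
hd30_rate = 1920 * 1080 * 30     # 62,208,000 pixels/s
ratio = retina_rate / hd30_rate  # roughly 3x the pixel throughput
```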

October 14, 2014 05:16 PM
Geometrian

> I think I'll try and do a full breakdown of the graphics in another post.

I would definitely be interested in this. I'm going to ask a few more questions; perhaps you want to point them there instead of answering here?

> The water surface takes up at least half the frame time, maybe more on lower devices. It includes heavy tessellation (because I'm planning to do deformation/splashing down the line),

This was surprising to me. I looked for water detail specifically, but thought you were probably using only two triangles and a normalmap because the intersections I saw on jumping looked flat. So, am I to understand that all that variation in lighting is representative of underlying geometry? I did notice the more complicated BRDF and the reflection (which I assume is a scaled inverse impostor?).

> The background is also not flat shaded; it's doing a dynamic gradient AND I'm adding film grain in on many devices to fix gradient banding. On top of that I have to fog-fade the creatures into the background, which means they're all computing the background (with grain!) too.

My impression was maybe a depth fog hack, without special consideration of the background. I didn't really think about it much, but I'd've guessed the gradient came from a scalar with elevation.

I can see all the shading for non-creatures happening without a texture fetch (and besides, the compute for depth fog is pretty cheap), so I'd think this is only a bottleneck because it just covers a huge amount of pixels. It's a pity; one's instinct is to try deferred shading--but the fragment cost is coming mostly from overhead, not compute or memory fetch. A rare situation for HPC.

The poly counts, despite being quite high, basically disappear against how much time is being spent on pixel processing.

This is certainly the rule on commodity computer graphics cards, because compute is free compared to memory accesses. The vertex unit gets its data fed directly to it, but the fragment generally pulls most of its data indirectly from GPU memory. Even if it's coherent, it's still an issue for the memory controller. This is why thread processors have many register files to amortize accesses.

I'm . . . less convinced for fewer thread processors. The scheduler plays a role because it batches less, but the most significant reason is that memory accesses for fragment programs become both more coherent and less frequent. Graphics memory gets fetched into thread warps' caches, and the memory controller needs to service multiple thread warps. However, for a simple GPU--like apparently mobile GPUs exclusively are--there's effectively only one thread warp, which is the memory controller's only customer.

Probably more importantly, as fragment programs get cheaper, rasterization and vertex shading start being important. Assuming the area shaded remains constant, adding more vertices makes your application vertex-bound. My (software) rasterizer starts showing linear scaling with the number of vertices once pixels:vertices gets around 1000:1, for a pass-through fragment shader. Whatever the magic ratio is for your architecture, shading, and scene determines whether the render is vertex- or fragment-bound.

I don't know. I feel like vertex shading should be a significant cost--both because of the simpler architecture and because your fragment shader is so simple. But, at the same time, you have so many pixels to shade maybe both are dwarfed by rasterization.

Best,
-G

October 15, 2014 07:50 AM
GeneralJist

intriguing....

October 16, 2014 11:33 PM
Madhed

Very cool. Congrats Promit

October 19, 2014 01:03 PM