About Ysaneya

  1. Patch screenshots

    The patch was released a few weeks ago. It introduced improved cockpits with lighting & shadowing, the hauler NPC ship ( static at the moment, but once the game becomes more mature there will be A.I. ), and military bases and factories ( currently placeholders: undetailed and untextured ) on planets. See the attached screenshots.

One of our artists, Dan Hutchings, is making a first pass on the space station modules. In Infinity: Battlescape, we designed our space stations, bases and factories to be modular. This means that we model & texture independent modules, which can be attached together in various configuration layouts. Here's one such layout for a space station: But more layouts / shapes are possible to create interesting combinations:

Meanwhile, I've been working on refactoring the client / server ( most of the code was quickly set up for our Kickstarter campaign and still suffers from architectural issues; for example, projectile hit detection is still client authoritative, which is a big no-no ) and improving networking latency, bandwidth and interpolation. I expect this work to take at least a month, if not more, but during this refactoring I'll also add a bunch of new gameplay elements ( teams, resources/credits generation, etc. ). Work has started on the user interface / HUD too, but I'll leave that for a future post. Here are pics of the cargo ship ( hauler ):
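The client-authoritative hit detection mentioned above is worth a sketch. Below is a minimal, hypothetical illustration of the server-authoritative alternative: the client only reports that it fired, and the server simulates the projectile and decides what it hit. All type and function names here are mine, not from the Battlescape codebase.

```cpp
// Hypothetical sketch of server-authoritative hit detection: the client sends
// a "fire" event; the server integrates the projectile against its own ship
// state and decides hits. The client never gets to claim "I hit him".
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

static double dist(const Vec3& a, const Vec3& b)
{
    const double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

struct Projectile { Vec3 pos; Vec3 vel; };
struct Ship       { Vec3 pos; double radius; bool alive; };

// Advance the projectile by one server tick and test it against the
// server-side ship positions ( simple sphere test for illustration ).
bool serverStepAndCheckHit(Projectile& p, std::vector<Ship>& ships, double dt)
{
    p.pos.x += p.vel.x * dt;
    p.pos.y += p.vel.y * dt;
    p.pos.z += p.vel.z * dt;
    for (Ship& s : ships)
        if (s.alive && dist(p.pos, s.pos) < s.radius) { s.alive = false; return true; }
    return false;
}
```

A real implementation would also rewind ship positions by the shooter's latency before testing, but the authority split is the point here.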
  2. Image compositing

    The first step to understanding PBR is to read more about BRDFs and what the different terms of the equations mean: those are the terms fed from the textures, like albedo, roughness, etc. If you don't understand what the BRDF does, you don't understand PBR.
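To make the terms concrete, here is a minimal, generic Cook-Torrance style BRDF evaluation ( Lambert diffuse plus GGX specular ), showing exactly where the albedo and roughness texture values plug into the equation. This is a textbook sketch with a simplified visibility term, not the shading code of any particular engine.

```cpp
// Scalar Cook-Torrance style BRDF sketch: Lambert diffuse + GGX specular.
// The Brdf struct holds the values that would be sampled from textures.
#include <algorithm>
#include <cmath>

static const double kPi = 3.14159265358979323846;

struct Brdf { double albedo; double roughness; double f0; }; // f0 = reflectance at normal incidence

// GGX / Trowbridge-Reitz normal distribution term.
static double ggxD(double nDotH, double roughness)
{
    const double a  = roughness * roughness;
    const double a2 = a * a;
    const double d  = nDotH * nDotH * (a2 - 1.0) + 1.0;
    return a2 / (kPi * d * d);
}

// Schlick's Fresnel approximation.
static double fresnelSchlick(double vDotH, double f0)
{
    return f0 + (1.0 - f0) * std::pow(1.0 - vDotH, 5.0);
}

// Evaluate the scalar BRDF for the given cosines ( assumed positive ).
double evalBrdf(const Brdf& m, double nDotL, double nDotV, double nDotH, double vDotH)
{
    const double diffuse = m.albedo / kPi;                  // Lambert term
    const double d = ggxD(nDotH, m.roughness);
    const double f = fresnelSchlick(vDotH, m.f0);
    const double g = 0.25 / std::max(nDotL * nDotV, 1e-4);  // crude visibility term (G ~= 1)
    return diffuse + d * f * g;
}
```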
  3. A retrospective on the Infinity project

    I'm not; we're two programmers and three artists, and we might hire a third programmer next year if the budget allows.
  4. A retrospective on the Infinity project

      Thanks. Unfortunately, taking a quick glance at the past entries, all the image links are dead and I have no easy way ( nor time ) to fix them :( I'll be posting our quick progress reports every week and, less frequently, more advanced posts about various topics.
  5. A retrospective on the Infinity project

      I'm not aware of any particular problem. It's fast and responsive here. @everybody: thanks for the welcome back :)
  6. Hey Everybody, long time no see, Ysaneya here ! I haven't posted in the past 6 years if I count well. Most of you probably don't remember me, but the few of you who do should remember the Infinity project and how it all started back in 2005. It started with a dream, one made of stars and full of procedurally-generated planets to visit. At the time, Elite was a long-forgotten franchise and nobody was working on a procedural universe. I started to work in my spare time on an MMO project called Infinity.

2005 - 2010: procedural dreams

In the first years, I researched procedural planet generation. I also developed an entire engine ( nowadays known as the I-Novae Engine ) to support all the features I'd need for the Infinity project, including:
  • A flexible scene-graph
  • A 3D renderer supporting all the latest-gen features and shaders ( shadow mapping, motion blur, HDR, dynamic lighting.. the usual list.. )
  • A physics engine ( I settled on ODE )
  • An audio engine ( OpenAL )
  • A network engine ( based on UDP )
  • All the procedural planetary & universe generation technology

In 2007 I released a small free game, simply named the "Infinity Combat Prototype". The goal of that game was to integrate the engine into a game, to validate that all the components were working together and that a game ( some Newtonian multiplayer combat in arenas in space ) could be produced. The idea was that it'd be the first step that would eventually lead to the whole MMO.

Unfortunately, it's pretty much at this point that I started to get "lost" in the ambition of the project. I had created the concept of "community contributions", where wannabe artists could submit artwork, 3D models & textures to be used in the game, but it quickly took a dozen hours a week to review all this work and validate/reject it, keeping in mind that 95% of it was at the indie level at best. I was the only programmer on the team, and so progress started to slow down tremendously.
We entered a vicious circle where, as the months passed, the cool brand-new technology was getting deprecated / looking obsolete, and catching up took months for a single feature. That was the time when I replaced the old-fashioned renderer with a deferred renderer, implemented dynamic lighting and shadow mapping and all sorts of visually cool stuff.. but meanwhile, gameplay progress was at a standstill. I spent some time working on the client/server architecture and databases, but nothing too fancy, and definitely not to the point it could be used for a full-fledged MMO.

By 2010 it became crystal clear that as the sole programmer of the project, even using procedural technology and an artist community to alleviate the content generation problem, I couldn't keep up. A few programmers offered their help but clearly weren't up to the task, or gave up very quickly after a few months. If you've been an indie relying on external help from volunteers to work on your project, that should ring a bell.

But in early 2010, I met Keith Newton, an ex-developer from Epic Games who worked on the Unreal Engine. He offered to set up an actual company, review our strategy and approach the problem from a professional & business perspective. I was about to give up on the project at that time, so naturally, I listened.

2010 - 2012: Infancy of I-Novae Studios

We formed the company I-Novae Studios, LLC, in early 2010, and started to look for investors that could be interested in the technology, or companies interested in partnerships or licensing. Unfortunately it was bad timing and we didn't realize it immediately. If you recall, this was right after the economic crisis of 2008. All the people we talked to were very interested in the tech, but none were ready to risk their money on a small company with no revenue. We had a few serious opportunities during those years, but for various reasons nothing ever came out of them.
Another problem was that this period was the boom of the mobile market, and most companies we talked to were more interested in doing mobile stuff than in a PC game. During these years we also revamped our technology from the ground up to modernize it. We switched to physically-based rendering ( PBR ) at this time, implemented a powerful node-based material system, added an editor ( one thing I simply never worked on pre-2010, due to lack of resources ) and much more. Keith worked approximately two and a half years full time, out of his own savings, to mature the tech and look for business opportunities. Meanwhile, our other artists and I were still working part time.

On the game side, unfortunately, things still weren't looking great. It was our strategy to focus back on the technology and put Infinity on hold. We came to the conclusion that we'd probably need millions to realistically have a shot at producing an MMO at decent quality and in good conditions, and that it couldn't be our first project as a company.

In 2012, Kickstarter started to become a popular thing. It was at this time that we started to play with the idea of doing a Kickstarter for a less ambitious project, but one still including our key features: multiplayer components and procedural planetary generation. That was how Infinity: Battlescape was born.

2013 - 2015: Kickstarter, full steam ahead

It took us more than 2 years to prepare our Kickstarter. Yup. At this point Keith was back to working part time, but I left my job to dedicate myself to the Kickstarter, working full time out of my own savings. To produce the Kickstarter we needed a lot of new content, never shown before, at near-professional quality. This included a ship with a fully textured PBR cockpit, multiple smaller ships/props, asteroids, a gigantic space station, multiple planetary texture packs and a larger cargo ship.
We decided pretty early to generate the Kickstarter video in engine, to demonstrate our proprietary technology. It'd show seamless take-offs from a planet, passing through an asteroid field, flying to a massive space station that comes under attack, with lots of pew-pew, explosions and particle effects. IIRC we iterated over 80 times on this video during the year before the Kickstarter. It's still online, and you can watch it here:

Meanwhile, I was also working on a real-time "concept demo" of Infinity: Battlescape. Our original plan was to send the demo to the media for maximum exposure. It took around 8 months to develop this prototype. It was fully playable, multiplayer, and included the content our artists generated for the Kickstarter trailer. The player could fly seamlessly between a few planets/moons, in space, around asteroids, or dock at a space station. Fights were also possible, but there were never more than a handful of players on the server, so we could never demonstrate one of the key points of the gameplay: massive space battles involving hundreds of players.

In October 2015, we launched our Kickstarter. It was a success: we gathered more than 6000 backers and $330,000, a little above the $300,000 we were asking for the game. It was one of the top 20 most successful video game Kickstarters of 2015. Our mass-media campaign was a disappointment and we received very little exposure; I understandably blame our "vaporware" history. The social media campaign, however, was a success, particularly thanks to a few popular streamers and Twitter users who brought exposure to us, and to Chris Roberts of Star Citizen, who did a shout-out on his website to help us. But as happy as we were to -finally- have a budget to work with, it was only the beginning..
2016+: Infinity: Battlescape

We started full development in February 2016 after a few months of underestimated post-KS delays ( sorting out legal stuff, proper contracts with salaries for our artists, and figuring out who was staying and who was leaving ). Since then, we've focused on game design, producing placeholders for the game prototype and improving our technology. We're still working on adding proper multithreading to the engine, moving to a modern Entity-Component-System ( ECS ), and figuring out what to do with Vulkan and/or DirectX 12. Meanwhile, we're also working on networking improvements and a more robust client/server architecture. The game is scheduled for release at the end of 2017. All the pictures in this article come from our current pre-alpha. https://www.inovaestudios.com/
  7. Your GLSL code shows that you use FP32, not FP64, so those artifacts are expected at LODs >= 15. One solution is to use FP64 ( dvec3 ), however not all GPUs support all the operations you need for procedural noise. In the I-Novae engine, I ended up using a hybrid solution: FP64 shader emulation. You can read more about it on this page: https://thasler.com/blog/?p=93 For performance reasons many operations still work in FP32 ( only the inputs to the procedural function, i.e. the planet position, use double emulation ), but that alone significantly improves the precision at the lowest LOD levels. In my tests, I can go up to level 18 with that "trick". Y.
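For anyone curious what the emulation looks like, here is a CPU-side sketch of the classic double-single representation that the linked article builds on: a value is stored as an unevaluated sum of two floats, and addition uses the error-free two-sum transformation. The names are mine, and this assumes strict IEEE-754 float behavior ( no fast-math or FMA contraction ).

```cpp
// Double-single ("df64") emulation sketch: a double-precision value is carried
// as hi + lo, where both parts are floats. This mirrors the GLSL trick of
// passing planet positions as two vec3s instead of one dvec3.
#include <cmath>

struct DsFloat { float hi; float lo; };

// Split a double into a high/low float pair ( lo captures the rounding error ).
DsFloat dsFromDouble(double v)
{
    const float hi = (float)v;
    const float lo = (float)(v - (double)hi);
    return { hi, lo };
}

// Knuth two-sum: add two pairs without losing the low-order bits, renormalize.
DsFloat dsAdd(DsFloat a, DsFloat b)
{
    const float t1 = a.hi + b.hi;
    const float e  = t1 - a.hi;
    const float t2 = ((b.hi - e) + (a.hi - (t1 - e))) + a.lo + b.lo;
    const float hi = t1 + t2;
    return { hi, t2 - (hi - t1) };
}

double dsToDouble(DsFloat a) { return (double)a.hi + (double)a.lo; }
```

At a planetary radius of ~6,350 km a plain float cannot even hold a quarter-meter offset, while the pair representation keeps it exactly.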
  8. Tech Demo Video 2010

    Quote: Original post by remigius
Regarding the depth issues, did you abandon the logarithmic approach or am I misunderstanding your explanation? You seemed quite fond of it back then, and as I understand Cameni's post, even with those tricks a plain floating-point z-buffer still has worse precision than a fixed-point logarithmic one. I'm no expert on the matter, so at the risk of asking a dumb question, I was wondering if there are any new performance issues or artifacts why you don't use the logarithmic approach ( if in fact you don't )?

It's weird, really. In theory, it should work well. I even experimented with the logarithmic trick in a pixel shader in my ASEToBin model viewer a few months ago, and the results were as expected: very good. So I switched to the technique that cameni and Humus mentioned, which is very simple:
  • use a 32-bit floating-point Z-buffer format
  • change the depth test to GL_GREATER instead of GL_LESS
  • swap the near and far planes in the projection matrix

Cameni explained in his journal that it'd give enough resolution for a whole planet. The problem is that in my tests, the cloud surface is horribly z-fighting with the ground surface. The distance between those 2 surfaces is around 10 km, with a viewer at around 10,000 km from the planet and a planetary radius of around 6350 km. Theory says it should work, so it must be a bug in my implementation. But I don't see where it could be, as there aren't that many places where it could have gone wrong, so what I'll probably end up doing is creating a small standalone prototype that renders two spheres in different colors with the above parameters, and verify whether they Z-fight or not..
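For reference, the depth mapping behind that trick can be written down directly. Assuming a [0,1] clip range, these two helpers give the post-projection depth of a view-space distance z for the standard and the reversed ( swapped near/far, GL_GREATER ) conventions; the reversed one puts the densely packed float values near 0 out at far distances, where precision is normally worst. This is a generic illustration, not the engine's actual projection code.

```cpp
// Post-projection depth of a view-space distance z, for a [0,1] depth range,
// with n = near plane and f = far plane.
#include <cmath>

double standardDepth(double z, double n, double f)
{
    return (f * (z - n)) / (z * (f - n));   // 0 at near, 1 at far
}

// Same projection with near/far swapped: 1 at near, 0 at far. Combined with a
// float depth buffer and GL_GREATER, far-away surfaces land where float
// values are densest.
double reversedDepth(double z, double n, double f)
{
    return (n * (f - z)) / (z * (f - n));
}
```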
  9. Tech Demo Video 2010

    It's been many years since the release of the last video showcasing the seamless planetary engine, so I'm happy to release this new video. It's actually a video of the game client, but since there's little gameplay in it, I decided to label it as a "tech demo". It demonstrates an Earth-like planet with a ring, seamless transitions, a little spaceship ( the "Hornet", for those who remember ), a space station and a couple of new effects. You can view it in the videos section of the gallery.

Making-of the video

Before I get into the details of what's actually shown in the video, a few words about the making of the video itself, which took more time than expected. What a pain ! First of all, it took many hours to record the video, as each time I forgot to show something. In one case, the framerate was really low due to the heavy stress of dumping a 1280x720 HQ uncompressed video to disk. The raw dataset is around 10 GB for 14 minutes of footage. 14 minutes ? Yep, that video is pretty long. Quite boring too, which is to be expected since there's no action in it. But I hope you'll still find it interesting.

Once the video was recorded, I started the compression process. My initial goal was to upload an HQ version to YouTube and a .FLV for the video player embedded on the website. The second was quite easily done, but the quality after compression was pretty low. The bitrate is capped at 3600 kbps for some reason, and I didn't find a way to increase it. I suspect it's set to this value because it's the standard with Flash videos. I also wanted to upload an HQ version to YouTube to save bandwidth on the main site, but so far it's been disappointing. I tried many times; each time, YouTube refused to recognize the codec I used for the video ( surprisingly, H264 isn't supported ).
After a few attempts I finally found one that YouTube accepted, only to discover that the video was then rejected due to its length: YouTube has a policy of not accepting videos that are more than 10 minutes long. What a waste of time. So instead I uploaded it to Dailymotion, but it's very low-res and blurry, which I cannot understand since the original resolution is 1280x720; maybe it needs many hours of post-processing, I don't know. There's also now a two-part HQ video uploaded to YouTube: part 1 and part 2. If you're interested in watching it, make sure you switch to full screen :)

Content of the video

The video is basically split in 3 parts:
1. Demonstration of a space station, modelled by WhiteDwarf and using textures from SpAce and Zidane888. Also shows a cockpit made by Zidane888 ( I'll come back to that very soon ) and the Hornet ( textured by Altfuture ).
2. Planetary approach and visit of the ring. Similar to what was already demonstrated in 2007.
3. Seamless planetary landings.

Cockpit

I was very hesitant about including the cockpit in the video, simply because of the expectations it could potentially generate. So you must understand that it's an experiment, and in no way guarantees that cockpits will be present for all ships in the game at release time. It's still a very nice feature, especially with the free look around. You will notice that you can still see the hull of your ship outside the canopy, which is excellent for immersion. Note that the cockpit isn't functional, so if we do integrate it into the game one day, I would like all instruments to display functional information, buttons to light on/off, etc..

Background

The backgrounds you see in the video ( starfield, nebula ) are dynamically generated and cached into a cube map. This means that if you were located in a different area of the galaxy, the background would be dynamically refreshed and show the galaxy from the correct point of view.
Each star/dot is a star system that will be explorable in game. In the video, as I fly to the asteroid ring, you will see that I click on a couple of stars to show their information. The spectral class is in brackets, followed by the star's name. At the moment, star names use a unique code based on the star's location in the galaxy. It is a triplet formed of lower/upper case characters and numbers, like q7Z-aH2-85n. This is the shortest representation I could find that would uniquely identify a star. This name is then followed by the distance, in light-years ( "ly" ). I still have to post a dev journal about the procedural rendering of the galaxy on the client side, in which I'll come back to all the problems I've had, especially performance related.

Planet

I'm not totally happy with the look of the planet, so it is likely that in the future I will do at least one more update of the planetary engine. There are various precision artifacts at ground level, as the heightmaps are generated on the GPU in a pixel shader ( so they are limited to 32 bits of floating-point precision ). I've also been forced to disable the clouds, which totally sucks as it completely changes the look & feel of a planet seen from space. The reason is that I implemented the Z-buffer precision enhancement trick that I described in a previous dev journal, and it doesn't work as expected. With clouds enabled, the cloud surface is horribly Z-fighting with the ground surface, which wasn't acceptable for a public video. At the moment, I use a 32-bit floating-point Z-buffer, reverse the depth test and swap the near/far clipping planes, which is supposed to maximize Z precision.. but something must have gone wrong in my implementation, as I see no difference with a standard 24-bit fixed-point Z-buffer. The terrain surface still lacks details ( vegetation, rocks, etc.. ).
I still have to implement a good instancing system, along with an impostor system, to get acceptable performance while maintaining a high density of ground features.

Look & Feel

Don't think for one second that the "look & feel" of the camera and ship behavior in this video is definitive. I'm pretty happy with the internal view and the cockpit look, but the third-person camera still needs a lot of work. It theoretically uses a non-rigid system, unlike the ICP, but it still needs a lot of improvements.

Effects

As you may notice, the ship's thrusters correctly fire depending on the forces acting on the ship and the desired accelerations. Interestingly, at any given point in time, almost all thrusters are firing, but for different reasons. First, the thrusters facing the planet fire continuously to counteract gravity. It is possible to power down the ship ( as seen at the end of the video ), in which case the thrusters stop working. Secondly, many thrusters fire to artificially simulate the drag generated by the auto-compensation of inertia. For example, when you rotate your ship to the right, if you stop moving the mouse, the rotation will stop after a while. This is done by firing all the thrusters that would generate a rotation to the left. Of course, some parameters must be fine-tuned.

When the ship enters the atmosphere at a high velocity, there's a friction/burning effect done in shaders. It still lacks smoke particles and trails. This video will also give you a first idea of how long it takes to land on or take off from a planet. The dimensions and scales are realistic. Speed is limited at ground level for technical reasons, as higher speeds would make the procedural algorithms lag too far behind, generating unacceptable popping. At ground level, I believe you can fly at modern airplane speeds.
A consequence of this system is that if you want to fly to a far location on the planet, you first have to fly to low space orbit, then land again around your destination point.
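The triplet naming scheme described earlier ( codes like q7Z-aH2-85n built from digits and lower/upper-case letters ) reads like a base-62 encoding of a unique star index: nine such characters cover 62^9, about 1.3e16 stars. Here is one possible reconstruction of such an encoder; it is a guess at the format, not the actual Infinity code.

```cpp
// Hypothetical base-62 star-name encoder: nine digits from [0-9a-zA-Z],
// grouped into three triplets separated by hyphens.
#include <cstdint>
#include <string>

static const char* kDigits =
    "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ";

std::string starName(std::uint64_t index)
{
    char out[9];
    // Fill from the least significant digit; out[0] ends up most significant.
    for (int i = 8; i >= 0; --i) { out[i] = kDigits[index % 62]; index /= 62; }
    std::string s;
    for (int i = 0; i < 9; ++i) { if (i == 3 || i == 6) s += '-'; s += out[i]; }
    return s;
}
```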
  10. The possible future of games

    For a technology that claims to have unlimited detail, I certainly believe the end result looks like crap. Now, I understand those guys are no artists, and that's fine; the colors are horribly chosen and the scene is repetitive, but I don't mind that. However, if you look carefully at the video when the camera gets close to some objects, like plants, you'll see that the "voxel" resolution is very low. It'd be equivalent to something like a 128x128 texture per square meter, while nowadays any half-decent 3D game will have ground and walls with 2048x2048 textures. Which brings me to the question: if you really have the technology to display unlimited detail, why don't you demonstrate it in your videos ? Like, at least having a virtual resolution equivalent to what a 3D game provides.. I don't doubt that the technology works, but the "unlimited" part is pure marketing bullshit. That, and it's all static. Show me the same scene with destructible walls, tons of walking characters and a high resolution, then I'll be impressed. Y.
  11. If your market is small indie developers that release small platform/puzzle games, there could be an interest in it. However, I agree with the other posters that to be more generally useful, it needs to support GL 2. No offense, but Quake 3 and Return to Castle Wolfenstein are almost a decade old. You're not proving anything by supporting them. By the way: Quote: TitaniumGL is also capable, to render your game with multicore cpu rendering Can you elaborate ? How can it do that ? Have you done some benchmarks ? Y.
  12. Quote: Original post by bluntman
Ah, exactly who I was thinking of when I said "someone" :). Thanks for the reply! So on your planets, what detail level do you actually go to? Have you made them 100% real world scale, or just close enough to look right?

It is 100% real-world scale, and my test planet has a radius of 6350 km. I have found that I'm starting to have precision issues at depth 13-14, but those are still acceptable. It gets much worse at levels 15-16, and totally unacceptable over 16. I need depth 16-17 to get down to meter resolution on the ground surface. Originally I generated the procedural geometry ( mesh vertices ) on the CPU so I could use doubles, and had no particular precision issues.

Quote: Original post by bluntman
I remember you mentioned somewhere that you managed to move your noise generation onto the GPU ( or maybe it was that you were going to ), did you manage, and if so, did you manage to still get around the precision problems ( as obviously there is no double support in 99% of cards )?

Correct, that was me. One thing I was dissatisfied with in the previous version was that I only generated the mesh on the CPU, and I wanted to generate normal maps too, which was too slow on the CPU ( imagine generating a 512x512 unique texture per chunk, each texel requiring 40+ octaves of noise ). So I implemented the procedural generation on the GPU, and now use it both for geometry ( with a read-back to the CPU ) and for normal map generation. To this day, I still have the precision issues, and as it works on the GPU I cannot use doubles ( yet ). I actually plan on moving the mesh generation back to the CPU, just to be able to reuse doubles and fix "cracks" in the terrain due to that lack of precision, and keep the normal maps on the GPU with the limited precision, but also limit the maximum depth the normal maps can go to, probably around depth 14-15. Not an ideal solution, but it's the best I can think of at the moment.
Quote: Original post by bluntman
So I converted my entire algorithm over to using double for everything ( iteratively, as each successive change didn't fix the quantization problems ), and I still have the same problems. I'm thinking it may have to do with using a local transform at each chunk-lod root. I have heard that double precision numbers have the capability to represent coordinates accurate to within a cm in billions of km.

That's correct: doubles have enough precision for a planetary engine. If I remember correctly, they were good enough for millimeter accuracy at a distance of 100 AU ( 1 AU = distance Sun-Earth ). If you're using doubles and still have precision issues, you must have a bug in your code somewhere, like a cast to float that you forgot.. Y.
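The float-vs-double gap discussed in this exchange is easy to quantify with the ulp ( the spacing between adjacent representable values ) around a coordinate. The helpers below are generic IEEE-754 illustrations, not engine code: at a 6,350 km planetary radius a float step is 0.5 m, so sub-meter vertex offsets simply vanish, while a double step at 100 AU ( ~1.5e13 m ) is still about 2 mm.

```cpp
// Measure the ulp (unit in the last place) around a value, i.e. the smallest
// coordinate step that precision can represent at that magnitude.
#include <cmath>

float  ulpAtF(float v)  { return std::nextafterf(v, 1e38f) - v; }
double ulpAtD(double v) { return std::nextafter(v, 1e308) - v; }
```

For example, ulpAtF(6350000.0f) is 0.5, which is exactly why adding a 0.1 m offset to a float planet-radius coordinate changes nothing.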
  13. How screwed am I?

    My advice would be to take another year ( assuming you can afford it financially ) and complete a good demo that you can show to employers in a year. After all, at this point, 4 or 5 years won't make a difference to an employer, but having a working demo will. Y.
  14. I've run into this exact problem, and spent a lot of time trying to find a solution, with moderate success. One thing that helps a bit is to change the input to FP64 ( doubles ) but keep as much of the algorithm as possible in FP32 for performance reasons. If you're using improved Perlin noise, it looks like this:

TFloat CPerlinNoise::improvedNoise3DD(const SVec3DD& xyz)
{
    /// improved Perlin noise, standard version
    // Integer lattice coordinates are derived from the double-precision input..
    const TInt xtr = MDoubleToInt(xyz.x - 0.5);
    const TInt ytr = MDoubleToInt(xyz.y - 0.5);
    const TInt ztr = MDoubleToInt(xyz.z - 0.5);
    const TInt X = xtr & 255;
    const TInt Y = ytr & 255;
    const TInt Z = ztr & 255;
    // ..then the fractional position is small enough to be cast down to float.
    const TFloat x = (TFloat)(xyz.x - xtr);
    const TFloat y = (TFloat)(xyz.y - ytr);
    const TFloat z = (TFloat)(xyz.z - ztr);
    const TFloat u = _smoothstep5F(x);
    const TFloat v = _smoothstep5F(y);
    const TFloat w = _smoothstep5F(z);
    const TInt A  = ms_impPerm[X] + Y;
    const TInt AA = ms_impPerm[A] + Z;
    const TInt AB = ms_impPerm[A + 1] + Z;
    const TInt B  = ms_impPerm[X + 1] + Y;
    const TInt BA = ms_impPerm[B] + Z;
    const TInt BB = ms_impPerm[B + 1] + Z;
    return(_lerpF(w,
        _lerpF(v,
            _lerpF(u, _gradF(ms_impPerm[AA], x, y, z),
                      _gradF(ms_impPerm[BA], x - 1, y, z)),
            _lerpF(u, _gradF(ms_impPerm[AB], x, y - 1, z),
                      _gradF(ms_impPerm[BB], x - 1, y - 1, z))),
        _lerpF(v,
            _lerpF(u, _gradF(ms_impPerm[AA + 1], x, y, z - 1),
                      _gradF(ms_impPerm[BA + 1], x - 1, y, z - 1)),
            _lerpF(u, _gradF(ms_impPerm[AB + 1], x, y - 1, z - 1),
                      _gradF(ms_impPerm[BB + 1], x - 1, y - 1, z - 1))))));
}

Y.