tebriel

Member
  • Content Count: 490
  • Joined
  • Last visited

Community Reputation: 904 Good

About tebriel

  • Rank
    Member

Personal Information

  • Interests
    Design
    Programming


  1. The blending I'm not too worried about; I've done plenty of stuff like that before. The main trouble is figuring out a way to "iterate over the correct pixels in the correct order", if that makes sense. I'll have to iterate over two (possibly three) differently oriented textures at the same time. For example, the bottom side's upper-right corner borders the front's bottom-left corner (in that diagram). The bottom side coordinate is (0, 2047), and the front side pixel coordinate is (0, 0). The direction to travel along the edge might vary, though. In this case, for the front side I need to increment the x value until I get to (2047, 0), and those pixels blend with the bottom's (0, 2047) through (2047, 2047), the latter coordinate being the bottom's bottom-right corner. So that's one edge, but other edges will need to hit different coordinates. I'm not sure whether in some cases I'll need to traverse backwards relative to one side because the adjacent coordinates are inverted. It seems like that's not the case, which will make it much easier than I was thinking. To start, I'll see if I can hard-code each edge's relationship; if I can flatten all the border pixels for all edges with a brute-force algorithm, maybe that will be a good enough starting point to build on (a rough sketch of that single edge is below). I'm thinking that if I focus only on the edge pixels for the first phase and get those to match their adjacent sides, I can apply a simple blur filter to the internal neighboring pixels, which would be much easier since I'd only have to worry about a single texture image at a time at that point.
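     A rough sketch of that single edge (Python/NumPy purely for illustration; the face[y, x] array layout and the plain 50/50 average are assumptions, not the actual utility's code):

     import numpy as np

     SIZE = 2048
     MAX = SIZE - 1

     def blend_front_bottom(front, bottom):
         """Average matching border pixels: front (x, 0) with bottom (x, MAX), for x = 0..MAX."""
         avg = (front[0, :].astype(np.float32) + bottom[MAX, :].astype(np.float32)) / 2.0
         front[0, :] = avg        # write the blended row back to both faces
         bottom[MAX, :] = avg

     # Hypothetical usage with dummy faces:
     front = np.zeros((SIZE, SIZE, 3), dtype=np.uint8)
     bottom = np.full((SIZE, SIZE, 3), 255, dtype=np.uint8)
     blend_front_bottom(front, bottom)
     print(front[0, 0], bottom[MAX, 0])   # the two border rows now agree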
  2. Whew, hard-coding it and just doing the tedious work of manually mapping coordinates seems easier now! ha ha  Seems like this is a little off the beaten path and I'm not going to find any magic bullet to make it easier.  Maybe if I could map all the sides' vertex coordinates (which match up with texture pixels) into some type of global coordinate system similar to a lat/long, that would make this easier (a rough sketch of that idea is below) - but it is probably easier said than done.
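     A rough sketch of the lat/long idea (Python for illustration; the face-to-axis table is an assumption, and the real orientations would have to come from the diagram):

     import math

     SIZE = 2048

     # Assumed mapping from a face and its (u, v) in [-1, 1] to a point on the unit cube.
     FACE_AXES = {
         '+x': lambda u, v: ( 1.0,   v,  -u),
         '-x': lambda u, v: (-1.0,   v,   u),
         '+y': lambda u, v: (  u,  1.0,  -v),
         '-y': lambda u, v: (  u, -1.0,   v),
         '+z': lambda u, v: (  u,    v,  1.0),
         '-z': lambda u, v: ( -u,    v, -1.0),
     }

     def face_pixel_to_latlon(face, x, y):
         u = 2.0 * x / (SIZE - 1) - 1.0              # pixel -> [-1, 1] across the face
         v = 2.0 * y / (SIZE - 1) - 1.0
         px, py, pz = FACE_AXES[face](u, v)
         length = math.sqrt(px * px + py * py + pz * pz)
         dx, dy, dz = px / length, py / length, pz / length   # normalize onto the unit sphere
         lat = math.degrees(math.asin(dy))                    # latitude from the y component
         lon = math.degrees(math.atan2(dz, dx))               # longitude in the x-z plane
         return lat, lon

     print(face_pixel_to_latlon('+z', 0, 0))   # a corner pixel of the assumed +z face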
  3. I'll take a look at the paper; sometimes they can help spark ideas.  I only wish it were that easy, Hodgman!  It's not a cube map in that sense.  What I'm doing is normalizing the surface coordinates of this cube around (0,0,0) to turn it into a sphere.  The textures are used for... texturing (duh), but even more importantly, height maps.  The vertices on the surface of the sphere get moved away from the center based on the value in the height map to create basic "terrain" (a tiny sketch of that step is below).   Hard-coding the edges... is my only current feasible idea, and probably the one I'll have to settle on. But the corners still seem problematic. If I crunch the numbers "by edge", will the edges being morphed later on screw up the previous iterations?  I'm not sure, but it seems that way.   So I believe I'll need to do all 12 edges, avoiding X number of pixels near the corners, and then do 8 more iterations to deal with the problem for each corner.  The 12 edges should be relatively easy to smooth out; we can think of those as just smaller rectangular textures. But the corners seem a little mind-bending: each corner region isn't a rectangular shape, so how in the heck am I going to iterate through that data?  Each corner will also have to respect the edge-smoothing passes done before, otherwise I'll just be creating new seams between the corners and the edges. Hopefully I'm just making this more complicated than it is, but I'm not sure yet.
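     And a tiny sketch of the displacement step (the radius, height scale, and sample value are placeholders, not the project's real numbers):

     import math

     def displace_vertex(cube_point, height, radius=1000.0, height_scale=10.0):
         """Normalize a cube-surface point onto the unit sphere, then push it outward."""
         x, y, z = cube_point
         length = math.sqrt(x * x + y * y + z * z)
         nx, ny, nz = x / length, y / length, z / length   # direction from the planet center
         r = radius + height * height_scale                # base radius plus the height offset
         return (nx * r, ny * r, nz * r)

     print(displace_vertex((1.0, 0.25, -0.5), height=0.37))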
  4. tebriel

    Not motivated at all

    Planning and goals help. A few things that work for me:
    - It helps to show people what you're working on--talk about the projects with others who are interested. It may vary by personality, but this can be a BIG help.
    - Setting realistic goals is important.
    - By goals I don't mean time-based deadlines. Leave that crap for the day job and just focus on forward momentum.
    - Have both difficult and easy goals. Focus on the hard problems when starting out or when more motivated, and put those on hold and switch to easier ones at other times. Keeping some forward momentum (even if small) should help.
    - When you're stuck on a difficult goal, seek another perspective from people here. If you're procrastinating on something, maybe it's not a lack of motivation but an inability to "prime the pump" and get started on a solution. I'm experiencing this one myself right now. Ugh!
    - Perfection is the enemy of the good (and the completed)! If it's working but not quite perfect yet and you're reaching a burnout point, accept it as good enough for now and move on to something more interesting! You can always come back to it later and refactor and improve. Besides, you never know when work at a later stage might impact something you just perfected earlier, negating all that effort you spent making it perfect. I'm not saying to throw things together like crap, but it's similar to premature optimization: it's not generally productive to polish and fine-tune things until your overall product is feature complete.

    I experience some of the same problems, although in my case it's even more silly - I have a tendency to waste time "playing around" with and watching what I've built so far. Physics engines are entertaining things. Still haven't figured out a solution to that one yet. ^_^
  5. I need a fresh perspective or ideas to help make solving this problem a little easier.  Without going into overwhelming detail, basically I have 6 large textures, one for each side of a cube (let's say each is 2048 x 2048 pixels).  I want to build a utility which modifies the edges of the textures so that they're seamless, similar to how GIMP or any other image processing application would create a seamless tiling texture.   This would be easy if I were just matching up the edges of two textures at a time, but the fact that this is a CUBE means that the corners make things tricky, since each edge needs to be matched up to two different textures, which also have their own bordering edges that need to be made seamless.  Another complication is determining the correct pixels on each texture to read and modify.  Due to the different ways the coordinates are oriented on each texture, it's not as easy as it sounds.  I could probably throw lots of code at the problem and "hard code" things for each edge/corner (roughly sketched below), but I'm hoping to find a less work-intensive and error-prone solution.   The attachment is an effort to help visualize how the textures border one another.  The (x, y) in the corners contain texture pixel coordinates; mx is the maximum value (e.g. index 2047).  The i & j can be read as x and y respectively (I also use i & j coordinates which range from -1 to 1 for some unrelated stuff, but we can ignore that bit here).  Red lines are just pointing out which sides are adjacent.   [attachment=36186:Cube_map.png] Is there some simple way to approach this problem that I'm failing to recognize?  Not sure if this is the best place to drop this one, so if a mod wants to move it somewhere better, go for it.
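     If it does come down to hard-coding, one way to keep the bookkeeping manageable might be an explicit edge-adjacency table plus a single helper that walks a face edge in a fixed direction. Purely a sketch: the face names, the edge encoding, and the two sample entries are assumptions and would have to be filled in from the diagram.

     SIZE = 2048
     MAX = SIZE - 1

     def edge_coords(edge, reverse=False):
         """Yield the SIZE (x, y) pixel coordinates along one face edge, corner to corner."""
         kind, fixed = edge                                    # ('row', y) or ('col', x)
         rng = range(MAX, -1, -1) if reverse else range(SIZE)
         for i in rng:
             yield (i, fixed) if kind == 'row' else (fixed, i)

     # (face A, edge on A, face B, edge on B, walk B's edge reversed?)
     EDGE_PAIRS = [
         ('front', ('row', 0),   'bottom', ('row', MAX), False),   # placeholder; verify against the diagram
         ('front', ('col', MAX), 'right',  ('col', 0),   False),   # placeholder; verify against the diagram
         # ... the remaining 10 edge pairs, filled in by hand from the diagram ...
     ]

     def blend_all_edges(faces):
         """faces: dict of face name -> 2048x2048 array of float pixel values, indexed face[y][x]."""
         for face_a, edge_a, face_b, edge_b, rev in EDGE_PAIRS:
             a, b = faces[face_a], faces[face_b]
             for (ax, ay), (bx, by) in zip(edge_coords(edge_a), edge_coords(edge_b, rev)):
                 avg = (a[ay][ax] + b[by][bx]) / 2.0            # or any fancier blend
                 a[ay][ax] = avg
                 b[by][bx] = avg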
  6. I corrected my incorrect normal map implementation and... it seems to be working pretty well.  Maybe I'll still try another technique later on.   For some reason, even though I had read about doing this a number of times, the specific wording Hodgman used above was what made it click properly. Thanks! 
  7. @[member='ErnieDingo'] Yeah, the seam issue pictured is close to that, and in both dimensions.  The normals I'm sampling at lower LOD are also biased towards the flat ground normals farther away, since the triangles themselves are larger.  Actually, doing it the way you describe sounds like a #5 option that I've been pondering, but I'm hesitant to toss out much of the work I've done so far.   The concept seems awesome though, so is most of the rendering work being done by the GPU in your case?  What I have performs fine, but I think my bottleneck is probably the vertex and normal buffers I'm passing to the GPU.  What I'd really like to do, if I can get the nerve to break away from my current methods, is try calculating everything in shaders and just sample my height maps to offset heights. But I feel like I'll run into issues like gaps that I'm not sure how to resolve. Also not sure if the fact that I'm doing huge spherical terrain (rather than flat "endless" terrain) will make that more difficult.   Did you go off any good online resources as references?  So you're not relying on a quadtree at all?  The idea of throwing out that nasty recursion is also appealing.     @[member='Hodgman'], I've tried using normal maps (it's a very common recommendation), and I think I was doing it wrong: I was loading normal data in the vertex shader instead of the fragment shader......  :wacko:  So, if I understand that correctly now, I should probably try it again and see how it looks when done correctly...  I wonder how it'll scale for very large data sets though?  I'm currently using 4096 x 4096 and I'm considering putting even more data in there...
  8. I think what I'm finding is that the normals are really the main (almost the only) issue.  When I don't use normals, I don't notice any popping at all.  The difficulty I'm experiencing with #4 is calculating the dang "tweening" level: I haven't managed to find a way to properly interpolate the normals (or geometry) smoothly from one level to another in the vertex shader without any popping (one possible way to drive it is sketched below).
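     A rough sketch of one way to drive it (plain Python standing in for the vertex shader, and the distance-band thresholds are made up): derive a 0-to-1 morph factor from how far through the current LOD's distance band the camera is, then lerp both position and normal toward the coarser LOD's values and renormalize.

     import math

     def morph_factor(cam_dist, band_near, band_far, morph_start=0.7):
         """0 for most of the LOD's distance band, ramping to 1 as the switch distance approaches."""
         t = (cam_dist - band_near) / (band_far - band_near)
         return min(max((t - morph_start) / (1.0 - morph_start), 0.0), 1.0)

     def lerp3(a, b, t):
         return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

     def morph_vertex(pos_fine, pos_coarse, n_fine, n_coarse, t):
         pos = lerp3(pos_fine, pos_coarse, t)
         nx, ny, nz = lerp3(n_fine, n_coarse, t)
         length = math.sqrt(nx * nx + ny * ny + nz * nz) or 1.0
         return pos, (nx / length, ny / length, nz / length)   # keep the normal unit length

     # Hypothetical usage: a vertex deep inside its band, so mostly morphed toward the coarse LOD.
     t = morph_factor(cam_dist=950.0, band_near=500.0, band_far=1000.0)
     print(t, morph_vertex((1, 2, 3), (1, 2.5, 3), (0, 1, 0), (0.2, 0.9, 0.1), t))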
  9. I'm trying to decide what approach to try next to reduce popping and improve visual quality for a project rendering large spherical terrain which pulls terrain detail from heightmaps.

     I'm using a quadtree for LOD, with skirting to fill gaps. I'm considering using geomorphing to address popping when LOD changes, but early experimentation made me realize that I'm going to need to interpolate more than just the terrain vertex positions. I believe normals will also need to be interpolated between different LODs, otherwise the lighting changes will be so drastic that morphing the geometry won't do much good. The same needs to be done for all the skirting too, otherwise I'll still see gaps and bad lighting. (Not hard, but a little more work.) I'm calculating vertex normals by simply taking the average of the six neighboring triangles' surface normals (a sketch of that averaging is below). As-is now, every vertex is part of 6 triangles; meshes are just evenly spaced 32x32 grids.

     What I'm not certain about is: will this approach of interpolating between different LOD normals (in addition to vertex position) be enough to get the job done? Or will I just end up with some other issue after I've done all the work implementing this?

     Another problem I'm seeing (which the above approach should at least mitigate, but maybe not resolve) is that normals at different LODs don't always agree. Skirting does well enough for dealing with the geometry gaps, but that does nothing to address normal disparities. I'm attaching a zoomed-in pic which shows this problem. You'll have to look closely, but you can see that around where the lighting issues are in this picture, the normals are pointing in slightly different directions for bordering meshes at different LOD. This vanishes entirely once the LODs are all the same, of course. I'm "padding" my meshes on the edges for purposes of calculating normals, so I don't think that's the reason they're off. (Actually, if that were the problem, it'd still look bad even when LOD matches.)

     Ideas to address this that I can think of so far:
     1. Try a new LOD level calculation scheme which attempts to calculate LOD error and boost the level of detail for areas where error is high. Not sure if this would work well for all types of terrain, though.
     2. Use some other method of calculating normals which isn't so dependent on LOD. I've experimented with pre-calculating normals and using the same for all LODs (essentially, using the normals in play at the highest level of detail), but that looks like crap. You can have situations where a very small (but steep) slope gives you a lousy vertex normal, totally throwing off the shading for large triangle regions at low LOD.
     3. Throw out the idea of using uniformly spaced 32x32 grids and try to generate meshes with higher detail where terrain is more complex. This makes skirting more challenging, and I think it'll be much more CPU-intensive to generate these non-uniform meshes. It may also cause my physics engine to choke if I end up with odd geometry.
     4. The above gradual geomorphing-style interpolation between LODs... and hope that's good enough. I'd provide my vertex shader with two values for vertex positions: the "actual" position, and an alternate position for every other vertex which moves it towards the center point of its neighbors. The same would apply for normals.

     I'm leaning towards trying a combination of 1 and 4 -- mostly 4. [attachment=35831:Screenshot from 2017-05-07 13-33-31.png]
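     For reference, a sketch of the normal averaging mentioned above (plain Python on a flat grid rather than the actual spherical 32x32 meshes; the grid spacing and heights are made up):

     import math

     def normalize(v):
         length = math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]) or 1.0
         return (v[0] / length, v[1] / length, v[2] / length)

     def cross(a, b):
         return (a[1] * b[2] - a[2] * b[1],
                 a[2] * b[0] - a[0] * b[2],
                 a[0] * b[1] - a[1] * b[0])

     def sub(a, b):
         return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

     def grid_normals(h, spacing=1.0):
         """Vertex normals for a heightfield grid: average the unit face normals of the
         triangles touching each vertex (6 for interior vertices on this triangulation)."""
         rows, cols = len(h), len(h[0])

         def pos(x, y):
             return (x * spacing, h[y][x], y * spacing)   # y-up world position of a grid vertex

         acc = [[(0.0, 0.0, 0.0)] * cols for _ in range(rows)]
         for y in range(rows - 1):
             for x in range(cols - 1):
                 # the quad's two triangles, as vertex indices, wound so the normals point up
                 for tri in (((x, y), (x, y + 1), (x + 1, y)),
                             ((x + 1, y), (x, y + 1), (x + 1, y + 1))):
                     p0, p1, p2 = (pos(*v) for v in tri)
                     n = normalize(cross(sub(p1, p0), sub(p2, p0)))
                     for vx, vy in tri:
                         ax, ay, az = acc[vy][vx]
                         acc[vy][vx] = (ax + n[0], ay + n[1], az + n[2])

         return [[normalize(acc[y][x]) for x in range(cols)] for y in range(rows)]

     # Hypothetical tiny grid of heights (the real meshes are 32x32 with edge padding):
     H = [[0, 0, 0, 0], [0, 1, 2, 1], [0, 2, 3, 2], [0, 1, 2, 1]]
     print(grid_normals(H)[1][1])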
  10. @Khatharr Was Adobe blame shifting?  Maybe...  I doubt the info is around anymore after all this time, but Adobe had some technical reasons for that claim.  Something about restrictive APIs forcing awkward implementations and not being upgraded to provide needed features, API flaws, or whatever.  I recall reading that Flash can only function as well as the environment it's given, or something of that nature.  But yeah, maybe you're correct; maybe that was just Adobe taking their turn at pointing the finger at the other guy, so you probably have a valid enough point.   I don't really have a dog in the "fight"... the fight is over.  The only thing that bothers me about it is that a lot of people and companies got away with telling half-truths to kill what was a perfectly good competing technology back when Jobs put out that "open letter" or whatever it was.  People have become a little too trusting in everything these technology company "authorities" say or claim; they have their own interests and profit margins to look after.  It's kind of a problem in recent years, with developers too--everyone just loves to bash the other guy's tech, and it gets old to me.
  11. 2015 was indeed a bad year (much of it was due to that single highly publicized 'Hacking Team' incident), although they patched those things in a matter of days, which is extremely fast.  Before 2014 the numbers drop significantly, and the majority look to be only in recent years from what I've seen.  Could be due to Adobe reassigning staff duties and cutting corners, the additional publicity, malware authors getting better or changing tactics, who knows.  To be clear, I'd agree that security is indeed a weakness.  Now we'll never know if that battle could have been sustained, were the platform not under siege by so many competing interests, or if a bigger/better company was behind it.  I'm mostly saying here that we need to at least take these things with a grain of salt, because the harshest critics have ulterior motives.  I'd be curious to see Symantec's methodology for ranking exploits, how the others rank in terms of severity, and what data there is on how much actual damage occurred, all of which may paint a different picture.

      In raw numbers, very rough NIST National Vulnerability Database stats (all-time numbers, so keep in mind Flash's age in comparison) look something like this.  Flash isn't a stand-out among the crowd here, especially considering that for many years the thing was on virtually every PC connected to the Internet, in contrast to most of these.  I would expect far worse numbers with that in mind.
      FireFox: 1,484
      iOS: 883
      Flash: 991
      Chrome: 1,402
      Safari: 746 (why so many for such a niche product?)
      QuickTime: 329 (hard to think of a more niche product, why is this so high? Flash's record is far better, especially considering it had 95% market penetration, and it does more than just play video...)

      The last few years have been pretty terrible for all software, really.  Numbers are up drastically pretty much all across the board (I wonder if some of that is due to HTML5).  Main point here: these other guys aren't so perfect either.  Making statements like "it's bad" or "deserves to die" is like beating your old workhorse to death after 20 years of helping you put food on the table.  I don't really understand the strong hatred.  The alternatives simply weren't up to the task, and are still quite lacking in some ways.  I'm not in denial that the only path now is to retire Flash, especially from the public Internet, and I've already moved on to other things.

      But considering the broader context of private/intranet usage (places where security is already mitigated by limiting or blocking Internet access entirely), it's not as urgent an issue.  A lot of productivity software needs to be rewritten, particularly in certain industries like banking/financial (believe it or not), which takes significant time and money.  Some indicators I've seen are that often the replacement software isn't as good, costs more to develop, takes longer, is expensive to maintain, and so on.  The development workflow for a former Flash dev on mobile or web is a painful experience after having developed for a much better platform.  The whole situation sucks all around.

      The security issues aren't actually what I was referring to when I brought up anti-Flash propaganda, though it's my biggest issue with it too.  Usually when the critics attacked it years ago, they'd have the same 10 things in a list they'd always hit it for, most of which were easily debunked or debatable, so seeing those things endlessly repeated over the years wears on you.  Security was probably the hardest to disagree with, but again, considering only that is like tossing out Windows purely for security reasons.  I'm here on Mint and I'm not even suggesting that...
  12. Well, I don't mind discussing the topic; actually I think it's good to hear other perspectives from people with in-depth knowledge.  Usually the people I see who don't like Flash only know that they don't like ads (easily blocked, and they're usually unblockable HTML/JS crap now anyway) or the stupid "skip intro" movies (extinct now, a good thing).  Flash definitely hasn't been stagnant though!  They've had many new releases over the years, adding major features like AS3, JIT compilation, hardware acceleration, multi-threading, new security features... I know I'm forgetting things.  All that stuff was way before the W3C got their act together enough to even get specs out for HTML5, so that we could be spammed with new, harder-to-block popup ads.  :)  Makes you wonder, maybe corporations didn't like how easily Flash was blocked; not good for marketing!  They are the ones running these standards committees, after all.

      Maybe you're an Apple or a FireFox user (both had questionable motives, particularly Apple, and neither put much effort into improving integration), and maybe there were more problems in conjunction with those, but whose fault is it if it's working fine in other major browsers?  Hard to say, I think.  FireFox has an unusual plug-in architecture which possibly caused issues (I've seen browsers crash too, but almost never due to Flash).  Apple has their own control issues, which Adobe had pointed to as a problem.  So I don't really trust those guys; Flash was certainly working well enough to obtain record-setting numbers of installs, and embedded itself into commercial software so firmly that it's still there today.  Anything with that kind of popularity will always be a huge exploit target (like Java and Windows), and there will always be many people with issues because of the sheer size of the user base.  In contrast to FireFox and Apple's terrible security records, Flash seems to have done pretty damn well.

      I'm pretty sure Flash doesn't come with any adware (I don't remember it ever doing so), so you're probably thinking of Java.  Or otherwise you could have downloaded malware (a fake version of Flash not from Adobe; no wonder it would crash then).  Flash was one of the most successful and popular front-end software platforms in computing history; it's hard to believe that'd happen because it's terrible.

      I'm not sure what stats you're looking at for HTML5, but I don't think the HTML5 security stuff is going to just get patched once and go away.  Everything ends up with security exploits eventually.  All that code still has to be maintained for new browsers, new hardware, new drivers, new features, and so on, so new issues will always arise.  Since browsers are now accessing the same lower-level API calls that Flash accesses, and implementing many similar features (JIT compilation for JS, hardware acceleration), they've expanded their own attack surfaces.  So I have trouble believing biased critics like Apple when they claim that there's an actual difference here in terms of security.  Especially considering the browser security stats I've seen, these guys aren't any smarter than Adobe's devs.

      I suppose if platform fragmentation is counted as a positive for security, it might be a small "improvement" (with its own cost).  If your platform is less consistent and reliable due to fragmentation, it makes malware harder to create (as well as regular software, of course), so I guess there's that.  Not sure that's a great strategy, making all the software suck.  And now the security problems are going to be harder to track and measure; the exploits and security stats will be scattered into browser stats, so it might look like an improvement, but maybe not be a real difference.  It's a complex topic for sure, not nearly as simple as so many content-free critical tech bloggers would have us believe.
  13. Flash is the victim of quite a bit of FUD and propaganda (thanks, Apple); much of the criticism isn't reasonable, or is no worse than the alternatives when examined objectively.  And of course you have the political/business aspects: Apple had the power to do serious damage to a platform poised to destroy their App Store profits (mostly from gaming, which is what Flash excels at), so of course they pulled the trigger.  And Adobe, being much smaller and with the marketing/communication skills of a chimp (vs. Apple's marketing machine), of course didn't win that popularity contest.  Even with Microsoft building it into their Edge browser, Flash indeed seems to be on its way out, so I guess there's little point in beating this dead-horse topic.  I will add, though, that the HTML5 security exploits have started, and they won't be going away.  Exploit stats will now be harder to gauge due to more fragmented implementations (which brings other fun issues), but the security risk will always be there.  Now we'll have to deal with more browser exploits.   AIR is an interesting alternative platform, absolutely great to develop for.  I've read somewhere (can't remember where) that supposedly Apple was legally blocked from banning AIR from their devices in Europe.  I'm not really a fan of legal bullying (or other types), but at least it gives people an alternative software platform choice.  AIR isn't supported on Linux, so I can't really get too enthusiastic about it anymore, but Adobe's site does claim a wide range of support with hardware acceleration.  Adobe seems to be focusing on the "casual" gaming market now.   I have zero interest in mobile gaming dev or the casual market, but if I did, I'd have to seriously consider AIR because it's an easy way (outside of something commercial like Unity) to create something cross-platform with a single code base.   For the type of things described by the OP, I think Unity is probably going to be the most popular. Maybe someone here has proof otherwise, but I'm pretty confident that HTML/JS are not particularly popular for gaming on mobile, or even desktops really.
  14.   Good to know, I'm usually working on Linux (which UE seems to support now), although I don't see why I couldn't toss stuff onto a Win machine if I ever want to distribute something.   
  15. Man, GD.net never disappoints.  Great feedback, thanks!