More Quake 3 map rendering problems


Hi,

I'm still fighting to get Quake 3 maps rendered properly with Direct3D. As you can read here, I have had serious problems with the Bezier patches. Well, I didn't quite solve them; I just switched to DirectX 10 and the artifacts disappeared. I also stitched the subpatches together, as jyk suggested.
But now I noticed some other problems:

First of all, my texture and lightmap coordinates all seem to be wrong:
Texture and lightmaps are off (patches in Q3Dm17)
Quake 3 banner is mirrored (Q3Dm17)

The most obvious solution to this problem seemed to be inverting the u texture coordinate for all vertices:


vertex.texture = D3DXVECTOR2(1.0f - vertices.texCoord.x, vertices.texCoord.y);

But then this happens (Q3Ctf1, main room), although the Quake 3 banner is now rendered correctly.

Another problem which really disturbs me is certain artifacts (Q3Ctf1, flag room).

This happens when you look at the edges between patches and regular faces at a certain angle: the background shows through.

To sum this up:
- Texture and lightmap coordinates are wrong
- Strange artifacts

Please help me, I'm really desperate to get this to work...

[Edited by - Soul Reaver on December 3, 2010 2:23:34 AM]

If memory serves, Q3A maps do store the lightmap UV coordinates explicitly. I'd say you're doing something very wrong with alignment or vertex packing, so the coordinates that look "right" are right by coincidence more than anything else.

As for the banner: is this Q3DM17?
If it is, that piece of geometry does apply a shader. If the shader (as in a Q3A .shader file) applies tcMod rotate (which is possible), then the texcoords will be all screwed up. Did you check the shader?
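For reference, a tcMod rotate stage transforms the coordinates roughly like this. This is a sketch from memory, not idTech3's actual code; in particular the sign of the rotation is an assumption and may be the opposite of what the engine does:

```cpp
#include <cmath>

// Sketch of a "tcMod rotate <degreesPerSecond>" stage: rotate the texture
// coordinates around the texture centre (0.5, 0.5) by an angle that grows
// with shader time. The rotation direction here is an assumption.
void tcModRotate(float& u, float& v, float degPerSec, float timeSec) {
    float a = degPerSec * timeSec * 3.14159265f / 180.0f;
    float s = std::sin(a), c = std::cos(a);
    float du = u - 0.5f, dv = v - 0.5f;
    u = 0.5f + c * du - s * dv;
    v = 0.5f + s * du + c * dv;
}
```

The point is that a stage like this completely rewrites the texcoords per frame, so comparing your static UVs against a vanilla screenshot is meaningless for shadered surfaces.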

Same thing for the flames. Flames are animated and blended using Q3A shaders that you have to parse. One of their features is that they can specify alpha maps independently from the diffuse RGB.
The flame shader in particular is likely additive, for which you correctly output black RGB as vertex color, but it seems the blendfunc is screwed up or the incorrect alpha is being supplied.

As for the final picture, what you're seeing is the typical flickering due to sub-pixel accuracy (a good thing) and floating-point errors (a bad thing).
Long story short: you must be sure the patches are stitched together correctly; you cannot tessellate them independently and hope not to experience that. Have a texture palette with "magnetic" vertices which you compare by some distance (1/32, 1/64, values like that, on a per-group basis).

Thanks for your reply,

the banner doesn't use a shader as far as I can tell; otherwise the application would fail to load the texture and it would appear black. Also, the effect int of that face equals -1 (just checked that).
Oh, and what I meant in the other screenshot wasn't the flames but the texture above, which is off if I invert the u-texcoord; if I leave it uninverted it is mapped correctly.

Quote:

Have a texture palette with "magnetic" vertices which you compare by some distance like (1/32, 1/64, values like that, on a per-group basis).

Honestly, I don't get what you mean, maybe because I don't know what a texture palette is. However, I tried to solve this problem by brute force (just comparing each vertex with every other) a while ago. But as you can expect, this resulted in very long load times, and I don't even know anymore if it worked.

Oh and I am stitching the subpatches together, the problem is the transition between the patches and regular faces.



Quote:
Original post by Soul Reaver
Oh and I am stitching the subpatches together, the problem is the transition between the patches and regular faces.
Just a minor note (and this may or may not be the cause of the problem you're seeing), but be sure that you use the actual end control points for the first and last points of each Bezier curve, rather than evaluating the curve at t = 0 and t = 1 and using the result.

(Realistically, evaluating at exactly 0 and 1 is likely to return the first and last control points exactly, I think. But, depending on how you're computing t over the curve, you can easily end up with a value that's not quite 1 for the last point. Better to just avoid any such problems entirely and use the first and last control points as is, IMO.)
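jyk's suggestion could look something like this for one quadratic curve (Q3 patches are biquadratic, so the same applies along both parameter directions). A sketch with an illustrative vertex type, not code from the thread:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Sample a quadratic Bezier curve at steps + 1 points, but hand back the
// control points themselves at the ends instead of trusting t = 0 and
// t = 1 to reproduce them exactly.
void tessellateCurve(const Vec3 c[3], int steps, Vec3* out) {
    for (int i = 0; i <= steps; ++i) {
        if (i == 0)     { out[i] = c[0]; continue; } // exact first endpoint
        if (i == steps) { out[i] = c[2]; continue; } // exact last endpoint
        float t = (float)i / (float)steps;
        float a = (1.0f - t) * (1.0f - t); // quadratic Bernstein weights
        float b = 2.0f * t * (1.0f - t);
        float d = t * t;
        out[i] = { a * c[0].x + b * c[1].x + d * c[2].x,
                   a * c[0].y + b * c[1].y + d * c[2].y,
                   a * c[0].z + b * c[1].z + d * c[2].z };
    }
}
```

Because the endpoints are copied rather than computed, a patch border and the adjacent brush face can share bit-identical positions, which is what kills the cracks.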

Quote:
Original post by Soul Reaver
the banner doesn't use a shader as far as I can tell, otherwise the application would fail to load the texture and it would appear black, also the effect int of that face equals -1 (just checked that).
Then this isn't q3dm17 or your assets are screwed. This screen in q3dm17 applies a shader. Just run ioquake or vanilla to check. If this isn't q3dm17, please say so, because it looks pretty much like the screen just below the red armor.
Quote:
Original post by Soul Reaver
Oh and what I meant with the other screenshot wasn't the flames but the texture above which is off if I invert the u-texcoord, if I leave it uninverted it is mapped correctly.
Would you please crank up the brightness? I'm still having difficulties. Are you referring to the black corner just below the jawbone to the top-left corner of the door?
Quote:
Have a texture palette with "magnetic" vertices which you compare by some distance like (1/32, 1/64, values like that, on a per-group basis).
Sorry, my fault here. There should have been no "texture" in the statement above; I really meant to write "vertex palette" rather than "texture palette".
Quote:
Oh and I am stitching the subpatches together, the problem is the transition between the patches and regular faces.
Excuse me, I really couldn't figure that out. I'd say jyk has probably hit on the correct solution then.

Do you mind elaborating more on all your problems? First of all, where were the shots taken (map and rough position)?

Okay, I changed my function to use the control points when the parameter is within EPSILON of 0.0f or 1.0f. Unfortunately, it didn't resolve the issue. Thanks for the good guess anyway.

EDIT: It is indeed Q3Dm17, but there is no shader applied. The other map is Q3ctf1, just in the main room. And yes, I mean the black areas of the geometry just above the door. Oh, I just realized that inverting the texture coordinate isn't even the cause of the glitch (it also happens with uninverted texcoords). Also, I changed AddressU to Mirror and this is what I get. I guess this is how it is supposed to be, BUT on the other side of the map (the map is symmetric) the same piece of geometry is still wrong.

EDIT2: Okay, my fault, the inverting process IS the cause of the texture glitch in Q3ctf1 (Visual Studio refused to rebuild the project and I didn't notice). So to sum this up:

Uninverted + Wrap: Banner mirrored, texture above door correct (on both sides)
Inverted + Wrap: Banner is correct, texture above both doors is wrong
Inverted + Mirror: Banner is correct, texture above one door is correct, above the other it is wrong

Btw, I added the map name and locations in my first post for each screenshot.
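For anyone following along, the two address modes being compared behave like these CPU-side models (my own sketch, not engine code). Wrap repeats the texture every whole unit; Mirror flips it on every odd tile, which is why it can "fix" one side of a symmetric map and not the other:

```cpp
#include <cmath>

// Wrap: keep only the fractional part, repeating the texture each unit.
float addressWrap(float u)   { return u - std::floor(u); }

// Mirror: fold into a period of 2, reflecting the odd half back.
float addressMirror(float u) {
    float m = u - 2.0f * std::floor(u / 2.0f); // fold into [0, 2)
    return m <= 1.0f ? m : 2.0f - m;           // reflect the odd tile
}
```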

[Edited by - Soul Reaver on December 3, 2010 5:07:36 AM]
