About Kalidor

  1. Quote: Original post by dyerseve: "...that would have taken me hours to notice."
     Throwing in some OpenGL error checking would significantly reduce that time. glNormalPointer generates a GL_INVALID_ENUM error if type is not an accepted value. Using something like GLIntercept could help even more.
  2. You're setting a mipmapping minification filter but only providing level 0 of the texture (i.e., no mipmaps). In that case it is as if texturing were disabled for the corresponding texture unit; see section 3.8.10 of the OpenGL spec. There may be other issues, but this is the one I always look for first (it is a very common mistake). Also, GL_LINEAR_MIPMAP_LINEAR is not a valid magnification filter; see glTexParameter for more info.
  3. Kalidor

    Part of a texture... transparent ...

     Quote: Original post by SulphurTenM: "One last question... It has become apparent that the order in which things are drawn matters; e.g. if there are two objects, obj1 and obj2, and obj1 is drawn first (and has some transparent features) while obj2 is behind obj1, then wherever transparent parts of obj1 overlap obj2, obj2 is treated as non-existent... [broken img link] Is there a way around this? I'm working in 3D, so unless I fix the camera, there are going to be a high number of occurrences of this type of effect."
     The standard way of dealing with this is to render all opaque geometry first, then render all transparent geometry sorted from farthest to closest. This comes up fairly regularly, so if you need more information a search of these forums should help. If you need it to be perfect, you could look into something like depth peeling; although it is rather slow, it might be worth the cost depending on your needs.
  4. Quote: Original post by InetRoadkill: "Well it does look like the LUMINANCE_ALPHA floating point is not supported yet... at least not on my system. The error being returned is GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT (0x8CD6). I'm not sure what that means."
     From the FBO spec: "The framebuffer attachment point <attachment> is said to be 'framebuffer attachment complete' if the value of FRAMEBUFFER_ATTACHMENT_OBJECT_TYPE_EXT for <attachment> is NONE (i.e., no image is attached), or if all of the following conditions are true: ... * If <attachment> is one of COLOR_ATTACHMENT0_EXT through COLOR_ATTACHMENTn_EXT, then <image> must have a color-renderable internal format. ..."
     So if changing the texture's internal format affects whether you receive that error, it seems that LUMINANCE_ALPHA is not yet a color-renderable internal format for your card. And just so you know, those floating-point texture constants spek mentioned come from the GL_ARB_texture_float extension.
  5. Here is the man page for gluBuild2DMipmaps; take a look at the internalFormat parameter. You're creating a texture with 3 color channels (RGB), so there is no alpha channel. Since alpha is missing, OpenGL takes the alpha channel to be 1.0 when it expands the color to RGBA.
  6. The Visual Arts forum has a lot of resources.
  7. Quote: Original post by ertnec: "... Right, got it to display on Windows now by changing glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, spritedata); to gluBuild2DMipmaps(GL_TEXTURE_2D, 3, width, height, GL_RGB, GL_UNSIGNED_BYTE, spritedata); ..."
     gluBuild2DMipmaps internally resizes the image data you send it to the nearest power-of-two size before creating all the mipmap levels of the texture. What this implies to me is that spritedata represents a non-power-of-two sized image, and your laptop has a video card/driver combination that supports non-power-of-two textures while the Windows machine does not. More information here would help.
     Quote: Original post by ertnec: "... On Windows now, it's drawing lines at 48 pixels across going vertical... Again, they're not apparent on my *nix machine..."
     I'm not sure what you mean; again, more information/screenshots would help. What are you trying to do, what are you actually doing, what do you expect it to look like, what does it actually look like, etc.
  8. Quote: Original post by yahn: "... Basically, I'm trying to create a function that can render a texture blended with a color. For example, I have a string stored in a texture that I want to be able to render as red or blue. ..."
     In that case you most likely don't want to use blending at all. Simply setting the texture environment mode to GL_MODULATE (the default) and then changing the primary color (glColor3f) should do what you want. That will multiply the texture's color by the primary color, so if the texture is white (1,1,1) and you set the primary color to cyan (0,1,1), the resulting color will be cyan as well.
  9. If I'm understanding you correctly, would something like what's being done in this chapter of GPU Gems 3 (search for the Voxelization section) be what you're looking for?
  10. Kalidor

    OpenGL two minor(?) questions

    That's right, but for the special case of transforming the origin ([0,0,0], or [0,0,0,1] in homogeneous coordinates) you can just read directly from the last column of the transformation matrix because the rotation/scaling part of it will have no effect (being multiplied by 0).
  11. Kalidor

    problem w/ blending

     What you are doing is a little off from what you seem to be describing. I think what you want to do is the following:
     - disable blending and enable depth testing
     - render the model
     - enable blending and disable depth testing
     - render the transparent quad
     - optionally disable blending if you don't want the axes themselves to be blended
     - render the axes
  12. Kalidor

    Corona - Loading a texture

     Quote: Original post by halogen64: "From what I gathered, I tried this. The first texture loads (without proper alpha support) and the second texture is completely white. Can anyone help me out? *** Source Snippet Removed ***"
     For "alpha support" you will need to enable blending and/or alpha testing (see Chapter 6 and Chapter 10, respectively, of the Red Book for more information). Your code looks okay as far as I can tell (I've never used Corona).
     I thought I saw you were only setting the minification filter for the first texture object, but then I refreshed and now I see it for both of them, so this might not be the problem. Still, texture parameters are per-texture-object state, so you do need to set them for each texture object (as the code now shows you are doing).
     What video card do you have? It may not support GL 2.0 or the GL_ARB_texture_non_power_of_two extension, in which case you will need to make sure your images have power-of-two dimensions (not merely multiples of two). You should also check for OpenGL errors; see Chapter 14 of the Red Book for more information on that.
  13. Kalidor

    Cubemap Textures

     Quote: Original post by xerodsm: "Ok, well I think I understand what I was missing before... I didn't realize that when you use the line glBindTexture(GL_TEXTURE_CUBE_MAP, cubemap_tex); you can bind any number of different texture locations to the cubemap and then just set the +/- x, y, and z cubemap faces. Well, I thought adding that to my code would help, but now when I try to render everything looks black. Below is my code to generate the cubemap. Does anyone see anything wrong? If I take out the glBindTexture(GL_TEXTURE_CUBE_MAP, cubemap_tex); line then it will render, but all the cubemaps for each different sphere look the same. *** Source Snippet Removed ***"
     The reason they all look the same when you don't bind a specific cubemap is that you overwrite the default cubemap texture's image data when you generate the cubemap for each sphere, and then you use that same default cubemap texture when rendering each time. The code you posted looks "okay"; are you checking for OpenGL errors? Can you show where you generate those cubemap_tex ids for each sphere? Also, it seems like you don't quite understand OpenGL's texture objects; I recommend reading Chapter 9 of the Red Book for some good information. Some of it is a little outdated by now, though (i.e., the "high performance working set" of textures and checking whether they're resident or prioritizing them).
  14. Right, I added information about that in my edit, but you were too fast. [grin]
  15. From Section 3.8.16 (Texture Application) of the OpenGL spec: "Texturing is enabled or disabled using the generic Enable and Disable commands, respectively, with the symbolic constants TEXTURE_1D, TEXTURE_2D, TEXTURE_3D, or TEXTURE_CUBE_MAP to enable the one-, two-, or three-dimensional, or cube map texture, respectively. If both two- and one-dimensional textures are enabled, the two-dimensional texture is used. If the three-dimensional and either of the two- or one-dimensional textures is enabled, the three-dimensional texture is used. If the cube map texture and any of the three-, two-, or one-dimensional textures is enabled, then cube map texturing is used."
     EDIT: Quote: Originally posted by jorgander: "My question is, how come the 2D texture is applied to the mesh even though I have the 1D texture bound (which implies the 2D texture is NOT bound)?"
     That implies only that the 1D texture target binding was changed from the default 1D texture. There is always a texture bound to each texture target; by default these are the default 1D, 2D, 3D, and cubemap textures (all of which are treated as texture objects with texture name 0). If you bind a new 1D texture object to the 1D target and a 2D texture object to the 2D target, the default 3D and cubemap textures are still bound to their respective targets. If you then change the 1D texture target binding again, the 2D texture target still has the same 2D texture object bound to it as it previously did. Now, with the above quote from the spec, you can see why binding a 2D texture to the 2D target, then binding a 1D texture to the 1D target, while having both 1D and 2D texturing enabled, will still use the 2D texture object. [Edited by - Kalidor on March 6, 2008 12:17:27 PM]