From Wavefront .mtl file to OpenGL?


I'm rewriting my Wavefront .obj loader and am finally implementing the .mtl loader with it. My question is what information in the MTL file is actually usable by OpenGL?

The following is a simple breakdown of the information available in a .mtl file:

Ns = Phong specular component. Ranges from 0 to 1000. (I've seen various statements about this range (see below))
Kd = Diffuse color weighted by the diffuse coefficient.
Ka = Ambient color weighted by the ambient coefficient.
Ks = Specular color weighted by the specular coefficient.
d = Dissolve factor (pseudo-transparency). Values are from 0-1. 0 is completely transparent, 1 is opaque.
Ni = Refraction index. Values range from 1 upwards. A value of 1 will cause no refraction. A higher value implies refraction.
illum = (0, 1, or 2) 0 to disable lighting, 1 for ambient & diffuse only (specular color set to black), 2 for full lighting (see below)
sharpness = ? (see below)
map_Kd = Diffuse color texture map.
map_Ks = Specular color texture map.
map_Ka = Ambient color texture map.
map_Bump = Bump texture map.
map_d = Opacity texture map.
refl = reflection type and filename

I understand the ambient, diffuse and specular values, as well as their related color and bump texture maps. But the Phong specular component? The dissolve factor, the refraction index, etc.? Are these numbers used only by modeling programs, or are they usable in OpenGL as well?
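For what it's worth, the breakdown above maps pretty directly onto a loader. Here's a rough sketch of the kind of parser I have in mind, covering only a few of the statements listed above; the struct layout and names are my own, not anything from the spec:

```cpp
#include <cassert>
#include <map>
#include <sstream>
#include <string>

struct Color { float r = 0, g = 0, b = 0; };

// My own working structure for one .mtl material, not an official layout.
struct Material {
    Color Ka, Kd, Ks;    // ambient / diffuse / specular colors
    float Ns = 0.0f;     // specular exponent
    float d  = 1.0f;     // dissolve (1 = fully opaque)
    std::string map_Kd;  // diffuse texture filename
};

// Parse a handful of .mtl statements; anything unrecognized is skipped.
// Returns materials keyed by their "newmtl" name.
std::map<std::string, Material> parseMtl(std::istream& in) {
    std::map<std::string, Material> mats;
    Material* cur = nullptr;
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream ls(line);
        std::string key;
        if (!(ls >> key) || key[0] == '#') continue;  // blank line or comment
        if (key == "newmtl") {
            std::string name;
            ls >> name;
            cur = &mats[name];
        } else if (cur == nullptr) {
            continue;  // statement appeared before any newmtl
        } else if (key == "Ka" || key == "Kd" || key == "Ks") {
            Color c;
            ls >> c.r >> c.g >> c.b;
            if (key == "Ka") cur->Ka = c;
            else if (key == "Kd") cur->Kd = c;
            else cur->Ks = c;
        } else if (key == "Ns") {
            ls >> cur->Ns;
        } else if (key == "d") {
            ls >> cur->d;
        } else if (key == "map_Kd") {
            ls >> cur->map_Kd;
        }
    }
    return mats;
}
```

The remaining statements (Ni, illum, the other maps) would just be more branches of the same shape.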

You could probably write a shader and use it, although I'm not exactly sure what their purpose is. It's a proprietary format (not an open one), so you could ask the company that created it.
Why not use a better-known format like DirectX .x, or just roll your own?

I started with .obj because it's an easy format to use for someone like me with no artistic ability whatsoever. Simple models, simple loading. I'll cover more formats eventually.

Do you really need all of that?
Everything is "usable" in OpenGL, whatever that means... You can control pretty much anything you want. Obviously, you have to write the code yourself...

Just because a format supports a lot of features doesn't mean you have to deal with all of them if you don't want to. Unless you want to write 3D modelling software. Is that your goal?

I don't really understand the question to be honest...

If you write a shader that implements their lighting/material equations, then of course all of those numbers will be usable by it ;)

Those values aren't implemented in OpenGL by default; they're not part of the fixed-function pipeline's API. You have to write shaders of your own to deal with them (well, that statement is imprecise at best, but you get the idea).

If that's what you were asking, anyway. If you're just curious about what those values mean, I'm pretty sure a Google search would suffice.

These "material" values are tied to the lighting model and renderer being used. I don't know exactly how Wavefront's rendering works, but some things are more or less common:
Phong specular component is used in Phong per-pixel lighting to calculate specular color (e.g. "shiny spots" on metal). It's analogous to shininess exponent used by OpenGL's fixed pipeline.
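To make that concrete, here's a small sketch: the Phong specular term just raises the reflection/view cosine to the Ns power. And since the fixed pipeline's GL_SHININESS must lie in [0, 128] while .mtl's Ns goes up to 1000, one common (but ad hoc) loader convention is to rescale before handing it to glMaterialf:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Phong specular intensity: pow(max(dot(R, V), 0), Ns).
// rDotV is the cosine between the reflected light vector and the view vector.
float phongSpecular(float rDotV, float Ns) {
    return std::pow(std::max(rDotV, 0.0f), Ns);
}

// GL_SHININESS is clamped to [0, 128], while .mtl's Ns ranges up to 1000,
// so loaders often rescale Ns before calling glMaterialf(..., GL_SHININESS, ...).
// The linear 128/1000 mapping here is just one common convention, not a rule.
float nsToGlShininess(float Ns) {
    return std::clamp(Ns * 128.0f / 1000.0f, 0.0f, 128.0f);
}
```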
Dissolve factor is pretty much explained, you can use it for e.g. alpha blending to make object semi-transparent.
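For example, with the usual glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) setup and the material's dissolve value fed in as the source alpha, each color channel blends like this (plain math, no GL context needed):

```cpp
#include <cassert>

// Result of the standard blend equation OpenGL applies with
// glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), using the
// material's dissolve value d as the source alpha.
float blendChannel(float src, float dst, float d) {
    return src * d + dst * (1.0f - d);
}
```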
Refraction index - well, that's a question about material modeling and ray tracing. Basically, refraction is the property of a material to change the direction of light that passes through it. Imagine modeling thick window glass or a windshield. While good-quality glass would only displace objects a little, through poor-quality glass you'd perceive outside objects (severely) distorted. Refraction is the basis of optical tools as well.
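If you ever did want to use Ni, Snell's law (n1 * sin(theta1) = n2 * sin(theta2)) is the underlying relation. A toy helper, under the simplifying assumption that light enters the material from air (index ~1.0):

```cpp
#include <cassert>

// Snell's law: n1 * sin(theta1) = n2 * sin(theta2), with n1 = 1 (air).
// Returns the sine of the refracted angle for a medium with index Ni.
// Assumes Ni >= 1, so total internal reflection can't occur here.
float refractedSin(float sinIncident, float Ni) {
    return sinIncident / Ni;
}
```

With Ni = 1 the direction is unchanged, matching the description in the first post; larger Ni bends the ray more strongly toward the surface normal.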
Illum - judging from description, it just controls the exact lighting model Wavefront's tools need to use for this material. E.g. for matte objects it's useless to calculate specular highlights, hence lighting model may be simplified and/or optimized.
Sharpness - can't tell for sure in this case, but my guess is it controls the sharpness of specular highlights. While the specular exponent controls the overall "focus" of the highlights (i.e. how big they are), sharpness is sometimes used to vary their "blurriness". Imagine two metallic spheres, one of them polished, surrounded by several light sources. Not only will the highlights on the polished one be smaller, they will also be "crisper".

Overall, as has been mentioned, which values from this pool to use depends on your desired lighting model. If you don't implement your own and stick with OpenGL's fixed pipeline, you're left with Gouraud shading at most, but the specular component and dissolve factor still apply to it.


Compare this: if you have a model from an .obj file, like a soccer ball, without the .mtl file it's a single-color ball, but with the .mtl file you see the black and white patches.
