OpenGL: Time to write rendering engine from scratch

Recommended Posts

I have a new assignment where my boss wants me to write the complete source code for a simple rendering engine that can load and render standard file formats like OBJ (at least one format), but it needs to be written from scratch.

It's been a while since I wrote one from the ground up (15 years or so), in fact it seems hardly anything in graphics is written from scratch these days. The pay is great so I don't have a problem doing it, but am at a loss for an estimate and wanted to get a consensus from those with experience.

How long would you estimate it would take to write an OpenGL program that can load say OBJ files and render models without fancy shaders?

Also, what file format is most straightforward to load and render?

Thanks in advance. Edited by bigneil

I agree, it will take time to learn everything necessary to write a serious, correct viewer that conforms to the endless specifications you'll need to know.
But feel free to ask anything in here if you run into any problems.
I would start with GLFW for the OpenGL window and go from there; it is multiplatform (Windows, Linux, Mac) and handles the window, input, and buffer swapping (including vsync) for you.
Unless you also have to do that by hand :)
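For reference, a minimal sketch of the kind of GLFW setup being suggested here (assuming GLFW 3 and linking against the platform's OpenGL library; error reporting and the actual model drawing are omitted):

[code]
// Minimal GLFW window plus render loop: open a window, clear, swap. (Sketch only.)
#include <GLFW/glfw3.h>

int main()
{
    if (!glfwInit())
        return -1;

    GLFWwindow* window = glfwCreateWindow(800, 600, "OBJ Viewer", nullptr, nullptr);
    if (!window)
    {
        glfwTerminate();
        return -1;
    }

    glfwMakeContextCurrent(window);
    glfwSwapInterval(1); // vsync

    while (!glfwWindowShouldClose(window))
    {
        glClearColor(0.1f, 0.1f, 0.1f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        // ... draw the model here ...

        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
[/code]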

I'd be extremely careful, as the goal is not clearly defined. I use Assimp for the loading, yet I'd find it hard to take less than 5 working days. But perhaps my definition of "without fancy shaders" is different from yours. Edited by Krohm

As L. Spiro says, it's personal. And if you don't have any experience coding with either DX or GL, it will probably take a while to figure out how things work.

Use Assimp: it's easy, it's good, it's fast, and it honestly saved me a lot of work and time. It also supports a ton of formats.
Estimate one month. That's a good starting amount: you have room to develop, to fix bugs, and to correct or rewrite things if you are forced to.
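For what it's worth, a minimal sketch of pulling the first mesh out of a file with Assimp's C++ interface (the post-processing flags and the idea of copying into flat arrays are just one reasonable choice, not the only way to use the library):

[code]
// Load the first mesh of a model file with Assimp and copy out positions and indices. (Sketch only.)
#include <assimp/Importer.hpp>
#include <assimp/scene.h>
#include <assimp/postprocess.h>
#include <cstdio>
#include <vector>

bool LoadFirstMesh(const char* path,
                   std::vector<float>& positions,
                   std::vector<unsigned int>& indices)
{
    Assimp::Importer importer;
    const aiScene* scene = importer.ReadFile(
        path, aiProcess_Triangulate | aiProcess_JoinIdenticalVertices);
    if (!scene || !scene->HasMeshes())
    {
        std::fprintf(stderr, "Assimp: %s\n", importer.GetErrorString());
        return false;
    }

    const aiMesh* mesh = scene->mMeshes[0];
    for (unsigned int v = 0; v < mesh->mNumVertices; ++v)
    {
        positions.push_back(mesh->mVertices[v].x);
        positions.push_back(mesh->mVertices[v].y);
        positions.push_back(mesh->mVertices[v].z);
    }
    for (unsigned int f = 0; f < mesh->mNumFaces; ++f)
        for (unsigned int i = 0; i < mesh->mFaces[f].mNumIndices; ++i)
            indices.push_back(mesh->mFaces[f].mIndices[i]);

    return true; // the Importer frees the scene when it goes out of scope
}
[/code]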

Generally unsure of myself? 15 years rusty?

And it would take Spiro 2 days and it would take me 2 months?

Who is Spiro trying to impress? A group of anonymous computer nerds?

FYI I've programmed OpenGL every day for the past 15 years, and patented my own rendering system in the process. My own product already served as a rapid prototyping tool and I solved their actual problem in my first few hours.

It just happens they want the source code (a license of which I sell for $250K).

I'm just pointing out that people don't usually write rendering engines from scratch anymore (the one I wrote in 1997 is still working fine thank you), and if someone tells you they can write a rendering engine from scratch in 2 days they are lying.

[quote name='bigneil' timestamp='1351866936' post='4996539']
I'm just pointing out that people don't usually write rendering engines from scratch anymore (the one I wrote in 1997 is still working fine thank you), and if someone tells you they can write a rendering engine from scratch in 2 days they are lying.
[/quote]
Your stated requirements were vague, basically "parse an OBJ model and display it to the screen with a basic shader". This, as stated, can be done very quickly, in a matter of days if you work hard at it. Of course, adding requirements and features will increase the time accordingly, but you did not mention them in your question, so they were not mentioned in the answer.

Your two posts are somewhat inconsistent with each other, is there something you're not telling us? Edited by Bacterius

[quote name='bigneil' timestamp='1351866936' post='4996539']
Generally unsure of myself? 15 years rusty?

And it would take Spiro 2 days and it would take me 2 months?

Who is Spiro trying to impress? A group of anonymous computer nerds?

FYI I've programmed OpenGL every day for the past 15 years, and patented my own rendering system in the process. My own product already served as a rapid prototyping tool and I solved their actual problem in my first few hours.

It just happens they want the source code (a license of which I sell for $250K).

I'm just pointing out that people don't usually write rendering engines from scratch anymore (the one I wrote in 1997 is still working fine thank you), and if someone tells you they can write a rendering engine from scratch in 2 days they are lying.
[/quote]

So, reading your first post, these are my thoughts:
- "It's been a while since I wrote one from the ground up (15 years or so), in fact it seems hardly anything in graphics is written from scratch these days. The pay is great so I don't have a problem doing it, but am at a loss for an estimate and wanted to get a consensus from those with experience."

Here you state that it has been a long time since you wrote one from the ground up. (On a quick read, this easily reads as "I haven't done it in 15 years.")
In addition, the title, "Time to write rendering engine from scratch", adds to the impression that you haven't done it in a while.

- "How long would you estimate it would take to write an OpenGL program that can load say OBJ files and render models without fancy shaders?"
This made me think you were asking experienced developers for an estimate of how long this task would take, because the preceding sentence makes you sound like a rookie. (You might not be a rookie.)


- "Who is Spiro trying to impress? A group of anonymous computer nerds?"
Nobody, I believe; in fact, I think you took this the wrong way, just as we took your question the wrong way.
If you have been writing OpenGL for 15 years, then you would be pretty sure of how long it would take you to develop this, because you know the problems, the issues, and the solutions, and you would also give a planned approach and an estimate based on that. Edited by Tordin

[quote name='bigneil' timestamp='1351866936' post='4996539']
it would take Spiro 2 days and it would take me 2 months?
[/quote]
I don’t know. How long [i]would[/i] it take you?
That is what we are trying to determine.


[quote name='bigneil' timestamp='1351866936' post='4996539']
I wwebsite as on the internet
[/quote]
You obviously know more about yourself than any of us, but put that aside for a moment and read your post and then my reply.

If you can’t handle the answer, don’t ask the question. If you do ask the question, you should provide all of the necessary information. When estimating how long it would take you to complete a task in OpenGL, it would be a good idea to mention any experience you have in OpenGL up-front.

It’s generally assumed that if you come here to ask a question, you [i]want[/i] to find people who are more experienced than yourself so you can get help from them.
If this wasn’t your thinking, then what was? Did you hope to find a bunch of people saying, “It would take me 3 weeks,” so that you could say to yourself, “What a loser, I am so much better than him or her”?


It sounds as though you already know the answer, so good luck with your project.


L. Spiro Edited by L. Spiro

Just to elaborate on L Spiro's first answer (which is quite correct, by the way), your first task is to create a window, get a basic message loop running, and write a basic render function that just clears the screen and does a swap buffers.

Depending on how familiar you are with that process, depending on how many platforms you want to support, depending on whether you need to write platform-native code or are allowed to (or are allowing yourself to) use any helper frameworks, depending on whether you need to learn a new platform from scratch, this could take anything from minutes to weeks or even longer.

It really is that open-ended and you've supplied insufficient info in your original question to allow anyone to correctly gauge your level of knowledge or ability here. 15 years experience in OpenGL means squat when it comes to having to learn the intricacies and pitfalls of a new platform from scratch - even someone like John Carmack still makes basic mistakes on the platform he's most familiar with, so please don't try to imagine that you won't.

From there you need to load and parse a model format. Go back and re-read your opening question; does it read like the kind of question asked by someone who is familiar with a good variety of model formats? Now, .obj is a plain-text format, so you've got some nasty parsing and conversion ahead of you; have you given any indication that you're aware of this? Do you see now why your open-ended question could elicit an open-ended response?
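To make the "nasty parsing" point concrete, here is roughly what a bare-minimum .obj reader looks like, handling only positions and simple triangular faces and ignoring normals, texture coordinates, negative indices, and the v/vt/vn face syntax (a sketch, not a robust loader):

[code]
// Parse only "v x y z" and "f a b c" lines from a Wavefront .obj file. (Sketch only.)
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

struct ObjData
{
    std::vector<float>        positions; // x, y, z triples
    std::vector<unsigned int> indices;   // zero-based triangle indices
};

bool ParseObj(const char* path, ObjData& out)
{
    std::ifstream file(path);
    if (!file)
        return false;

    std::string line;
    while (std::getline(file, line))
    {
        std::istringstream ls(line);
        std::string tag;
        ls >> tag;

        if (tag == "v")
        {
            float x, y, z;
            ls >> x >> y >> z;
            out.positions.push_back(x);
            out.positions.push_back(y);
            out.positions.push_back(z);
        }
        else if (tag == "f")
        {
            // .obj indices are 1-based; "1/2/3"-style face vertices are NOT handled here
            unsigned int a, b, c;
            ls >> a >> b >> c;
            out.indices.push_back(a - 1);
            out.indices.push_back(b - 1);
            out.indices.push_back(c - 1);
        }
        // "vn", "vt", "usemtl", comments, etc. are silently skipped
    }
    return true;
}
[/code]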

Back to .obj; like I said, there is some nasty parsing and conversion to be done, but for the most part this is already a solved problem. But you say "from scratch", so you may first need to learn the format, then write a loader, then decide how you're going to draw it (as this will greatly influence your loader). Are you going to use immediate mode? Vertex arrays? VBOs? How much experience do you have with these? It's possible to spend 15 years doing OpenGL without ever having gotten within spitting distance of a VBO (especially if the codebase you're most familiar with dates back to 1997), so do you need to learn VBOs from scratch too? You may need to if "it must run fast" is part of your specification (which you haven't clarified).
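As a concrete illustration of the VBO route, a sketch of uploading already-parsed data into buffer objects (this assumes a function loader such as GLEW, a current GL context, and, on core profile, a bound VAO and a shader with the position attribute at location 0; none of that is shown):

[code]
// Create buffer objects from already-parsed vertex data. (Sketch only.)
#include <GL/glew.h> // or any other GL function loader
#include <vector>

struct Mesh
{
    GLuint  vbo = 0, ibo = 0;
    GLsizei indexCount = 0;
};

Mesh CreateMesh(const std::vector<float>& positions,
                const std::vector<unsigned int>& indices)
{
    Mesh m;
    m.indexCount = (GLsizei)indices.size();

    glGenBuffers(1, &m.vbo);
    glBindBuffer(GL_ARRAY_BUFFER, m.vbo);
    glBufferData(GL_ARRAY_BUFFER,
                 positions.size() * sizeof(float), positions.data(), GL_STATIC_DRAW);

    glGenBuffers(1, &m.ibo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m.ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                 indices.size() * sizeof(unsigned int), indices.data(), GL_STATIC_DRAW);

    // Position attribute: three tightly packed floats at attribute location 0 (assumed).
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);

    return m;
}

// Per frame, with the buffers still bound:
//     glDrawElements(GL_TRIANGLES, m.indexCount, GL_UNSIGNED_INT, 0);
[/code]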

Again, do you see how open-ended this is?

From there, and since you mention that you're not using shaders (or are you just not using fancy shaders?) I'm going to assume that you're back in familiar territory and can form your own estimate - you're the person in the best position to do that. But do note that I said "assume" here. And also note that I left out anything relating to whether or not the model needs to animate.

As for the simplest model format to load, that would be a proprietary format that you design yourself, set up so that you can read the data directly into the most appropriate in-memory layout for rendering. It could be as basic as a single fread followed by a single glBufferData call; it doesn't get much simpler than that. As a general rule, the more open a model format is, the more complex it is, because it needs to support the needs of multiple programs doing different types of rendering, and it can virtually always be counted on to take the flexibility side of that tradeoff.
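To illustrate the "single fread followed by a single glBufferData" idea, a sketch of loading a self-designed binary format (the MeshHeader layout here is invented purely for the example; it is not an existing format):

[code]
// Read a hypothetical pre-baked binary mesh straight into a vertex buffer. (Sketch only.)
#include <GL/glew.h> // or any other GL function loader
#include <cstdint>
#include <cstdio>
#include <vector>

struct MeshHeader
{
    uint32_t vertexCount;    // number of vertices in the blob
    uint32_t bytesPerVertex; // size of one interleaved vertex
};

GLuint LoadBakedMesh(const char* path)
{
    std::FILE* f = std::fopen(path, "rb");
    if (!f)
        return 0;

    MeshHeader header = {};
    std::fread(&header, sizeof(header), 1, f);

    // One read pulls in the whole vertex blob, already laid out for the GPU.
    std::vector<unsigned char> blob(header.vertexCount * header.bytesPerVertex);
    std::fread(blob.data(), 1, blob.size(), f);
    std::fclose(f);

    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, (GLsizeiptr)blob.size(), blob.data(), GL_STATIC_DRAW);
    return vbo;
}
[/code]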

I'm not sure how "It's been a while since I wrote (a rendering engine) from the ground up" was interpreted as "it's been a while since I programmed graphics (in any capacity)", or who would take "you sound rusty and unsure of yourself" as anything less than insulting, but I appreciate the advice. Thanks everyone.

Because I was just tasked with writing a rendering engine from scratch I wanted to get a consensus on what people who have written rendering engines more recently thought.

In the end I decided to whittle down my own code, since it works perfectly (and recompiles in 6 seconds, to all you A-holes who write programs that take 2 hours to compile (see Electronic Arts)).

But man, I forgot what a bunch of d*uche b*gs are on this site. Members here (what an appropriate term) pride themselves on their blog rating here and you'll never find a site more likely to neg you for one post that doesn't stroke their ego. Try saying you don't like Goto for example - they'll neg you all the way to negative 1000.

But try asking an advanced question about parametric surfaces and you'll get crickets.

Most of the people here with high ratings don't actually have computer science jobs (or formal experience) which is why they have so much time (and why they are so bitter).

Worse, they provide best-case scenario estimates - an enormous disservice to themselves, coworkers and the industry. A wise man once said "The more I know the less I understand" - and members here understand everything.

If you've worked in the industry you know you'll never find smarter people who are willing to do exactly what they are told (by their boss, the FDA and the TSA). Developers in a crisis will (according to Myers-Briggs) "go with what the group does".

Oh well, I'd rather be making $90 an hour programming OpenGL for my own business than to have a high db rating, but if you make less be sure to neg me.

Anytime you do something worthwhile you're going to offend someone.

Whoa whoa whoa
[quote name='L. Spiro' timestamp='1351813476' post='4996360']
[b]If[/b]...
[b]If[/b]...
[b]If[/b]...
[b]If[/b] you are generally unsure of yourself (and it [b]sounds[/b] as though you are), I would say you should give yourself no less than a month, maybe up to 2.

[b]Because of the wide range of skills and experiences, I am not sure how helpful any of our answers would be[/b].
[/quote]
1) I don't believe LS was tearing you down or diminishing your character. And I'm pretty sure the way you reacted to LS and similar answers sent this thread downhill. Clarification on your part probably would have gone a long way.
2) [i]Plenty[/i] of people here don't like goto (unless absolutely positively necessary, which it usually isn't).
3) I don't believe you can get a negative rating - perhaps I am wrong about that one though.

Please try to reconsider your perspective on all this - it's not what you think it is.

[quote name='achild' timestamp='1352238672' post='4998210']
3) I don't believe you can get a negative rating - perhaps I am wrong about that one though.
[/quote]
No, you actually can. The lowest I've seen so far is -350 or so. Not that ratings matter that much anyway. I've come to the conclusion that the thread creator is a troll, and you know what they say about trolls. Edited by Bacterius

I can only second L Spiro here.

You ask a question, which it must be assumed was well-intentioned and in good faith.
You get some answers that also read as well-intentioned and in good faith.

From there it seems that you didn't like the answers you got and the resulting conflagration borders on thermonuclear war. Not good.

I'm not saying that you're obliged to like the answers, but there is such a thing as basic manners - people have freely and willingly given of their own time here.

One can only answer a question based on the information contained in that question; if the answers received are believed to be inadequate or otherwise lacking, then there is obviously a communication deficit of some kind to be addressed. It strikes me that it would have been far more productive to address that deficit than to engage in flaming out.

[quote name='Caffeware' timestamp='1352320584' post='4998571']
[img]http://public.gamedev.net//public/style_emoticons/default/huh.png[/img]
[/quote]

Excellent Caffeware - I downloaded it and it ran instantly - this is how all software should be written!

Yes! I almost achieved my negative rating!

This shows what a farce it is.

When you're a 20 year veteran of the 3D graphics industry and already worked on Madden Football, Far Cry, Doom, Descent and other games, the rating system doesn't disprove you, you disprove the rating system. Edited by bigneil

"Strikes me that it would have been far more productive to address that deficit than to engage in flaming out."

It's the forum that flames out. I was once a member for years and had a rating of 400 or so, then when I said I didn't like Goto my rating went all the way to -400 - OVER ONE THREAD. Edited by bigneil

I really do not get why you're reacting this aggressively to people's reactions; there was no need for any aggression, and nobody tried to flame you or attack you in any way whatsoever. You got some valid answers to your question. There may have been some misunderstandings and miscommunications, but that's no reason to suddenly react the way you did.
You claim to be an industry veteran of 20 years; if that's true, you might want to act somewhat more like an adult in this situation instead of flaming and trying to present yourself as superior to everyone else because of the projects you have done or (of all things) the compile times you get.

[quote name='bigneil' timestamp='1352410597' post='4999033']
When you're a 20 year veteran of the 3D graphics industry and already worked on Madden Football, Far Cry, Doom, Descent and other games, the rating system doesn't disprove you, you disprove the rating system.
[/quote]
I am really getting tired of you and your inferiority complex. You asked a question hoping to find people with more experience than you on a given subject.
I answered because that subject is exactly down my alley. When I go to work, I make a game engine (unless I am going to my other job, that is). When I get home, I make a game engine.

[i]Game[/i] engines aren’t [i]rendering[/i] engines, but it just so happens that my current task at work is the rendering part, and on my own engine I am currently working on the rendering part.
[quote name='bigneil' timestamp='1352410597' post='4999033']
Because I was just tasked with writing a rendering engine from scratch I wanted to get a consensus on what people who have written rendering engines more recently thought.
[/quote]
And I am exactly the person you wanted to answer this question. [i]I am writing a damned book about rendering/loading graphics files/etc. as we speak.[/i]

Suddenly you are offended that someone is better than you at this task, yet why wouldn’t I be? It’s what I do all day every day—whatever you do all day every day probably makes you better at that than I am.
So your inferiority complex kicks in and you have to make up for it in other ways—salary, past projects, status, etc.
I have been extremely kind up to this point, having passed up numerous chances to put you in your place, but here is the deal: [i]You don’t impress me.[/i]
There are people here who do, but you will never be one of them. They impress due to their skills and professionalism. Your salary, past projects, and status as a business owner are [b]not[/b] impressive. Once again I am leaving it at that instead of taking you down a few notches, although you do deserve to be put in your place. I can at least say that, based on your salary, past projects, and status as a business owner, you have no place to brag, so listing these things as an attempt to make up for your shortcomings in skill only makes you seem that much more desperate to have lived a better life. Just stop trying to make yourself more than you are, because I can promise you that no respectable person I have ever met has made the kinds of posts you have made, trying to prove themselves to some other random person online. Each time you do so just makes you much less of an impressive person and much more like the kind of person who wishes he or she was more than he or she actually is.

[i]You[/i] are what’s bringing this site’s quality down.
You only come here to flame and brag about yourself, and this topic serves absolutely no purpose to anyone who stumbles upon it in the future. It starts with a very vague question and quickly becomes a 1-man flame-war. Then you make it clear that your goal is only to get a negative reputation (I still haven’t touched your reputation, by the way).
I am not a moderator here, but it is [i]obvious[/i] that if you don’t take a contributory tone starting from your very next post, [i]you will be banned[/i].


L. Spiro Edited by L. Spiro

At this point I need to chime in about reputation.

Reputation is NOT there to stroke anyone's ego. Any member of this site with sufficiently high reputation is most definitely NOT getting an internet hard-on (pardon the choice of phrase, couldn't think of a better way of putting it) from it, because they understand the purpose of the system.

The reason for reputation is to act as a service to the community. People can identify the good answers to questions, the ones that are most likely to be the right answer or to at least shove them in the right direction.

A good, fully functioning member will [i]welcome[/i] negative reputation as much as they welcome positive reputation, because they will see negative reputation as something that puts their own previously held assumptions to the test. Because they understand that no matter how much one is experienced, no matter how much one knows, learning is a continuous process.

As a slight digression, what is the state of your engine, Spiro, and what are its benefits? I'm always interested in hearing what goes on :)
As a member who frequents another forum devoted to a specific piece of software, I hope I can offer something in the nature of an opinion.
I myself do not work on rendering engines. I have little experience, yet enough to understand that it can quickly become an all-encompassing task.
The issue here is that you opened with a generic question and then failed to clarify it. That is the cause of the trouble. If you have a genuine question, ask it again, and perhaps apologise if you feel that you have overreacted. If you feel that this is unfair, then the topic should simply be left as it is, in case it proves useful to others. Best of luck to all.
