Archived

This topic is now archived and is closed to further replies.

Rocket05

OpenGL (GULP!) mixing OpenGL and DirectDraw

Recommended Posts

Rocket05    152
Okay, I know there are a lot of people out there who don't think it can be done, but I know it can. People have been able to use OpenGL and Allegro (which uses DirectDraw) together, so I researched it on MSDN and I think I have a way to do it. OpenGL can be set up to render to a plain device context by using PFD_DRAW_TO_BITMAP instead of PFD_DRAW_TO_WINDOW. The only problem is that I don't know how to set up a device context that holds a bitmap; MSDN's method just doesn't make sense to me. Anyone know how? And feel free to tell me if this won't work before I go through the headaches of trying to make it work.
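My best untested guess at that setup, pieced together from the docs, looks something like this (the function name and the 32-bit format are just my own placeholders); is this even close?

// Untested sketch: create a 32-bit DIB section and select it into a memory DC,
// so OpenGL could later render into it with PFD_DRAW_TO_BITMAP.
#include <windows.h>

HDC CreateOffscreenDC(int width, int height, void **bits)
{
    BITMAPINFO bmi = {0};
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = width;
    bmi.bmiHeader.biHeight      = height;      // positive height = bottom-up DIB
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 32;
    bmi.bmiHeader.biCompression = BI_RGB;

    HDC hMemDC = CreateCompatibleDC(NULL);     // memory DC compatible with the screen
    HBITMAP hDib = CreateDIBSection(hMemDC, &bmi, DIB_RGB_COLORS, bits, NULL, 0);
    SelectObject(hMemDC, hDib);                // the DC now "holds" the bitmap
    return hMemDC;                             // caller must DeleteObject/DeleteDC later
}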

Dactylos    122
And I don't believe it's a very good idea either. It would probably work on some cards and not on others, depending on the driver implementation.

If all you want is a simple (and portable) way of setting up a window for OpenGL rendering, use GLUT or SDL.

gmcbay    130
Don't try it; it's a lost cause.

You might be able to make it work on some video cards, but it will fail on others, and any method you figure out to make it work on more than one card is bound to cause all sorts of inefficiencies that negate any benefit. If you want to use OpenGL, use it for the 2D drawing as well; there are tons of tutorials online that explain how to do 2D blitting in OpenGL using ortho-projected quads. (Stay away from the glDrawPixels interface; it's generally slow as hell on most PC OpenGL drivers.)
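The usual approach looks something like this rough sketch (the function name and arguments are made up, and it assumes the sprite texture has already been uploaded):

// Rough sketch: draw a sprite as an ortho-projected textured quad (OpenGL 1.x style).
// 'tex' is assumed to be an already-uploaded texture; coordinates are in pixels.
#include <windows.h>
#include <GL/gl.h>

void Blit2D(GLuint tex, float x, float y, float w, float h, int screenW, int screenH)
{
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, screenW, screenH, 0, -1, 1);   // pixel coordinates, y pointing down
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(x,     y);
        glTexCoord2f(1, 0); glVertex2f(x + w, y);
        glTexCoord2f(1, 1); glVertex2f(x + w, y + h);
        glTexCoord2f(0, 1); glVertex2f(x,     y + h);
    glEnd();
}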

I have no experience with Allegro, but I'd guess (assuming the people who wrote it aren't crazy) that when it's being used with OpenGL it doesn't actually use DirectDraw, but falls back to GDI or to OpenGL for its 2D operations. I know this is what SDL does.

Any headaches caused by trying to mix OpenGL and DirectDraw are your own problem; you've been warned!

Guest Anonymous Poster
To select a bitmap into a device context, go like this:

HBITMAP hBitmap = NULL;
HDC hBitmapDC = CreateCompatibleDC(NULL);   // memory DC compatible with the screen

// Load the image from disk as a DIB section so its pixels are accessible
hBitmap = (HBITMAP)LoadImage(NULL, "bitmap.bmp", IMAGE_BITMAP, 0, 0,
                             LR_LOADFROMFILE | LR_CREATEDIBSECTION);

SelectObject(hBitmapDC, hBitmap);           // the DC now draws into that bitmap

brettporter    122
The reason not to use DDraw + OpenGL together is that some drivers already do it internally. You will conflict with the driver, and it will complain.

See the developer FAQ at OpenGL.org for more information (also available on GameDev).

In conclusion, don't do it!

~~~
Cheers!
Brett Porter
PortaLib3D : A portable 3D game/demo library for OpenGL

Rocket05    152
Okay, okay, I knew I was going to get some negative feedback. However, I know it can be done. You're right, there COULD be some problems mixing the two, but if it can be done in two applications (running an OpenGL app in one window and a DDraw app in another), why can't it be done in the same program? By the way, when I talked about using WAGL (WinAllegroGL), it does not fall back to pure OpenGL; Allegro still uses DDraw.

Here's how I plan to do it:

1. Create a bitmap in memory (not loaded from an image) and select it into an HDC so that OpenGL can render to it (<-- don't know how to do this).

2. Set up OpenGL the same way as for a window, except use the created HDC instead of the window's HDC from GetDC().

3. When choosing a pixel format, use the flags PFD_DRAW_TO_BITMAP | PFD_SUPPORT_OPENGL | PFD_SUPPORT_GDI (do NOT use double buffering or PFD_DRAW_TO_WINDOW).

4. Use OpenGL to draw into that bitmap; then, when I want the image OpenGL has drawn, get a GDI DC from the DDraw back buffer and BitBlt the OpenGL HDC onto it (rough sketch below).
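A minimal sketch of steps 2 through 4, assuming hMemDC is the memory DC from step 1 and lpBackBuffer is a placeholder for the DirectDraw back-buffer surface; whether any given driver handles this path at all (let alone quickly) is anybody's guess:

// Sketch of steps 2-4: give the memory DC a PFD_DRAW_TO_BITMAP pixel format,
// render with OpenGL, then BitBlt the result onto the DirectDraw back buffer.
#include <windows.h>
#include <ddraw.h>
#include <GL/gl.h>

void RenderAndBlit(HDC hMemDC, LPDIRECTDRAWSURFACE7 lpBackBuffer, int width, int height)
{
    PIXELFORMATDESCRIPTOR pfd = { sizeof(PIXELFORMATDESCRIPTOR), 1 };
    pfd.dwFlags    = PFD_DRAW_TO_BITMAP | PFD_SUPPORT_OPENGL | PFD_SUPPORT_GDI;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;               // must match the DIB section's bit depth
    pfd.cDepthBits = 16;

    SetPixelFormat(hMemDC, ChoosePixelFormat(hMemDC, &pfd), &pfd);   // once per DC
    HGLRC hRC = wglCreateContext(hMemDC);
    wglMakeCurrent(hMemDC, hRC);

    // ... OpenGL drawing goes here (software rendered on most drivers) ...
    glFinish();                        // make sure the pixels have landed in the DIB

    HDC hSurfaceDC;
    lpBackBuffer->GetDC(&hSurfaceDC);  // GDI access to the DDraw back buffer
    BitBlt(hSurfaceDC, 0, 0, width, height, hMemDC, 0, 0, SRCCOPY);
    lpBackBuffer->ReleaseDC(hSurfaceDC);

    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(hRC);
}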

I'm pretty sure it will work, albeit slowly, and I have no idea whether some video cards will be able to handle it while others can't. So what; I want to see what happens.

Can anybody tell me how to set up the HDC so OpenGL can render to it? Not loading an image into an HDC; I mean actually creating it at runtime.

Julio    116
Yes, it can be done, but so what? It's a pain in the ass. You can use OpenGL for 2D stuff just as well as DDraw, if that's the problem.



How many Microsoft employees does it take to screw in a light bulb?
None, they just declare darkness as the new standard.

gmcbay    130
Lots of things could be done but shouldn't be. You'd be much better off spending your time inventing something new, or learning more useful programming techniques, than trying to kludge together two APIs that aren't meant to work together.

Why spend time making something work when the best you can hope for is an unsupported kludge that runs much slower than normal and will likely have unforeseen problems on some systems?

And, lastly, the source code for WAGL seems to be available (I did a quick search on Google); why don't you just download it and see how they do it first hand?

Guest Anonymous Poster
The problem with OpenGL is that you can't switch bits-per-pixel on the fly using ChangeDisplaySettings() (or whatever the Windows function is). You'd have to tell the user to change it himself in order to play the game. And if you want to support both 16-bit and 32-bit modes, that would not be fun.
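For reference, the call being referred to looks roughly like this (just a sketch; a window's pixel format is fixed once SetPixelFormat has been called on it, so after a depth change the GL window would typically have to be recreated):

// Sketch: switch the display to 800x600 at 16 bpp with ChangeDisplaySettings.
DEVMODE dm = {0};
dm.dmSize       = sizeof(DEVMODE);
dm.dmPelsWidth  = 800;
dm.dmPelsHeight = 600;
dm.dmBitsPerPel = 16;
dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;

if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) != DISP_CHANGE_SUCCESSFUL)
{
    // fall back: ask the user to change the desktop color depth manually
}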

Guest Anonymous Poster
HDC CreateDC(
LPCTSTR lpszDriver, // driver name
LPCTSTR lpszDevice, // device name
LPCTSTR lpszOutput, // not used; should be NULL
CONST DEVMODE* lpInitData // optional printer data
);
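For what it's worth, CreateDC opens a device context for an actual device (for example the whole primary display); it isn't a memory DC that you can select a bitmap into, which is what CreateCompatibleDC is for. A display DC would look roughly like this:

// Sketch: a DC for the whole primary display (not a bitmap-backed memory DC).
HDC hScreenDC = CreateDC(TEXT("DISPLAY"), NULL, NULL, NULL);
// ... use it ...
DeleteDC(hScreenDC);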

brettporter    122
The reason I gave has been highlighted by most here.

could != should

You could write a hi-tech 3D game in x86 assembly.
But it'll probably break.

Similarly if you try to run DDraw and OpenGL together.

Please don't encourage anyone else to try this; it's been shown to be a Bad Idea™ already.

~~~
Cheers!
Brett Porter
PortaLib3D : A portable 3D game/demo library for OpenGL


Edited by - brettporter on August 5, 2001 5:10:43 AM

Zeblar Nagrim    150
I made an evaluation program for OpenGL and Direct3D a couple of years ago, but I never mixed the two APIs; it was more like a wrapper around them. You must look into COM stuff if you really want to understand how D3D works and how it could be mixed with OpenGL.



Zeblar Nagrim, Lord of Chaos

Rocket05    152
Alright, alright, I can't believe all this negative feedback. I would have thought more people out there would have tried this already. Most of you are saying that it could be done but shouldn't be, "could" meaning you just really don't know whether it's going to work or not. The reason I want to perform this 'black magic' is that OpenGL rocks for 3D stuff, but an orthographic view in OpenGL just doesn't cut it for 2D graphics. It can be done, yes, but it is really a pain not being able to do straight blits and only having textures in powers of two.

Really, who out there has tried this before? I want advice from someone who has.

I realize that if I wanted to use 3D and 2D stuff at the same time, I should really just use D3D7, because it sits on top of DDraw and I can access the surfaces easily. But I just really don't like the pain of setting up D3D7. I could use the D3DX library (which I have before), but I just don't like the feel of it (OpenGL is addictive).

I probably should just let this one go, though, because the code for mixing the two would just be too complicated for me.

Thanks for your input though; it probably saved me quite a few headaches on a lost cause.

GameCreator    997
I'd answer your question but I don't know.

I'd just like to say that I'm disappointed that there are more people repeating others, saying that they "don't think" you should do it, than there are people telling you how. Really sad.

As of now, I think Rocket05 has gotten enough messages saying that some people believe he shouldn't do it. How about hearing from those people who have actually done this and can help him out, eh?

brettporter    122
Hey, Rocket05 read the posts.

Perfectly valid reason not to do it.

I have tried it, and it worked. It gave me exactly zero benefit. On the other hand, people using the same code have stated it broke.

It's not a pessimistic forum, it's a realistic forum.

~~~
Cheers!
Brett Porter
PortaLib3D : A portable 3D game/demo library for OpenGL

GameCreator    997
Choose to be either helpful or judgemental.

If you're judgemental, you post opinions.
If you're helpful, you post code.

(Maybe I'm just a sucker for arguments, but I hate leaving people without help when I know someone has the answer.)

You can't tell someone that what he learns will certainly not help him. You don't know that. Maybe he's smarter than you and just needs help getting started.

gmcbay    130
The thread should really just die, but ...

Those of us who are warning against mixing DD and OpenGL aren't being judgemental; we're just trying to save this guy some trouble.

Imagine if he were mixing two chemicals known to cause an explosive reaction when combined. There's a chance he could strike a balance and combine them without such a reaction through trial and error with different amounts, even though doing so doesn't offer him any real benefit. Would we still be judgemental if we warned him that he's setting himself up for trouble?

I don't think so.



Edited by - gmcbay on August 9, 2001 2:57:00 AM

evaclear    166
There is a right way and a wrong way to do many things. In this case, mixing OpenGL and DirectDraw in the manner you describe is not "the right way". Unfortunately, most people have to experience "the wrong way" of doing things before they realize why it's not right. The reason integrating DirectDraw and OpenGL in that manner is "wrong" is simple: you're creating more programming work and frustration for yourself, which could be avoided with a better design.

Example 1: Load OpenGL and DirectDraw together and pray that no conflicts happen. (This is not how Allegro works!)

Example 2:
1. Abstract the DirectDraw functionality behind your own function and structure names.
2. Abstract the OpenGL 2D functionality using the same names and structures as the DirectDraw ones (overloading the functions).
3. On startup, either ask the user which API they would like (DX or OpenGL), or search for a .dll file that contains your DX or OpenGL abstraction; you then load the .dll and it calls the correct DX or OpenGL functions for you.

Obviously Example 2 looks like more work. However, it isn't once you figure in the debugging time and workarounds Example 1 will take.

Example 2 is exactly what Allegro does, as well as what most of the new 3D game engines like Unreal Tournament or LithTech do. A rough sketch of the idea follows.
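Something along these lines, as a minimal sketch (the class and function names here are made up for illustration, not Allegro's actual interface):

// Hypothetical abstraction layer for Example 2: one interface, one implementation per API.
#include <windows.h>

class Renderer2D {
public:
    virtual ~Renderer2D() {}
    virtual bool Init(HWND window, int width, int height) = 0;
    virtual void Blit(int image, int x, int y) = 0;
    virtual void Flip() = 0;
};

class DDrawRenderer : public Renderer2D {
public:
    bool Init(HWND, int, int) { return true; }  // would create DirectDraw surfaces here
    void Blit(int, int, int) {}                 // would call IDirectDrawSurface::Blt
    void Flip() {}                              // would call IDirectDrawSurface::Flip
};

class OpenGLRenderer : public Renderer2D {
public:
    bool Init(HWND, int, int) { return true; }  // would set up a GL context and ortho view
    void Blit(int, int, int) {}                 // would draw a textured quad
    void Flip() {}                              // would call SwapBuffers
};

// At startup, pick one; the rest of the game only ever talks to Renderer2D.
Renderer2D *CreateRenderer(bool useOpenGL)
{
    if (useOpenGL) return new OpenGLRenderer();
    return new DDrawRenderer();
}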

