Paul C Skertich

Can I create multiple Device Contexts?

Recommended Posts

Hey everyone. Inside the editor I have the level scene, and in a new Windows Form I have a Material Editor. The level editor initializes the DX device and device context, which attach to the level editor window. My question is: if I create a second DX device and attach it to the Material Editor window, would that be a performance issue?

For instance, in UDK you have a level editor, and when you open a material editor you get a model showing the material you are currently editing. The same goes for the animation editor, etc.

Thanks,

Paul

There is absolutely no need to create a second device for this; you just create two independent scenes which use the same render device/render context for rendering.
As you're building a level editor, I'll assume you already have a decent engine as a back-end, so it should be absolutely no problem for you to use this approach.
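A minimal sketch of that approach, assuming two window handles (hwndLevel and hwndMaterial are hypothetical names for the level editor and material editor windows) and omitting error handling for brevity:

// One ID3D11Device shared by two windows, each with its own swap chain.
#include <d3d11.h>
#include <dxgi.h>
#pragma comment(lib, "d3d11.lib")

ID3D11Device*        g_device  = nullptr;
ID3D11DeviceContext* g_context = nullptr;

void CreateDeviceAndSwapChains(HWND hwndLevel, HWND hwndMaterial,
                               IDXGISwapChain** levelSwap,
                               IDXGISwapChain** materialSwap)
{
    // One device + immediate context, created with no swap chain attached.
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION,
                      &g_device, nullptr, &g_context);

    // Retrieve the DXGI factory behind the device so both swap chains
    // are created from the same source.
    IDXGIDevice*  dxgiDevice = nullptr;
    IDXGIAdapter* adapter    = nullptr;
    IDXGIFactory* factory    = nullptr;
    g_device->QueryInterface(__uuidof(IDXGIDevice), (void**)&dxgiDevice);
    dxgiDevice->GetAdapter(&adapter);
    adapter->GetParent(__uuidof(IDXGIFactory), (void**)&factory);

    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferCount       = 1;
    desc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM; // width/height 0 => taken from window
    desc.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.SampleDesc.Count  = 1;
    desc.Windowed          = TRUE;

    desc.OutputWindow = hwndLevel;            // level editor viewport
    factory->CreateSwapChain(g_device, &desc, levelSwap);

    desc.OutputWindow = hwndMaterial;         // material editor preview
    factory->CreateSwapChain(g_device, &desc, materialSwap);

    factory->Release();
    adapter->Release();
    dxgiDevice->Release();
}

Each frame you'd then bind one swap chain's back buffer as the render target, draw that window's scene, and call Present on that swap chain.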

Thanks, chief! I've been doing DirectX 11 for 3 months now. On top of that, there was an 11-year gap in any programming, so I'm refreshing a lot of it. The engine is currently being built alongside the editor. When you say 'scenes', what does that mean? Like where I clear the back buffer and depth buffer and begin rendering? The 'scene' you referred to just confused me.

If you're building an engine you're going to have to make a very clear separation between low-level operations like manipulating your render device, the rendering pipeline(s) you build on top of that render device, and the scene(s) which feed data to the pipeline(s). Note that these are all decoupled components.

The render device would be responsible for directly manipulating your low-level graphics API (DirectX 11 in your case) and would be able to directly set render states, bind resources, do draw calls, etc.

The pipeline would be responsible for sending structured commands to your render device; these would be things like 'draw a point light' or 'render a solid textured mesh', etc.

The scene is composed of entities which make up your environment. The level you're building with your level editor is a scene, but the environment in which you apply a material to an object in your material editor will be a scene too, albeit a separate one from the level you're designing. The data in your scene entities related to rendering would be sent through your pipeline so your render device can properly render them.
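To make that concrete, here's a rough sketch of the layering. The class names (RenderDevice, RenderPipeline, Scene) are illustrative, not any particular engine's API, and the implementations are omitted:

#include <d3d11.h>

struct PointLight; struct Mesh; struct Material;

// Low level: talks directly to D3D11 (states, resources, draw calls).
class RenderDevice {
public:
    void SetBlendState(ID3D11BlendState* state);
    void BindTexture(unsigned slot, ID3D11ShaderResourceView* srv);
    void DrawIndexed(unsigned indexCount);
};

// Mid level: turns structured commands into render device calls.
class RenderPipeline {
public:
    explicit RenderPipeline(RenderDevice& device) : m_device(device) {}
    void DrawPointLight(const PointLight& light);
    void DrawTexturedMesh(const Mesh& mesh, const Material& material);
private:
    RenderDevice& m_device;
};

// High level: a collection of entities; knows nothing about D3D11 itself.
class Scene {
public:
    void Render(RenderPipeline& pipeline); // submits this scene's entities
};

// The level editor and the material editor each own a Scene, but both feed
// the same pipeline and render device:
//     levelScene.Render(pipeline);
//     materialPreviewScene.Render(pipeline);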

If these concepts are new or vague to you, you should probably reconsider your plan of building a level editor and an engine, and work on something a lot more basic right now. These are definitely not implement-while-you-learn subjects.

Edited by Radikalizm

I was trying to implement a real-time editor that would, for now, let me see how the game will look and feel. You know, man? I understand that DX11 uses the render pipeline stages - Geometry Shader, Input Assembler, and others I can't remember right now. The part I was getting confused on is that the managed assembly of my DLL file attaches itself to a Windows Forms control. The way I have it, it attaches the output window to Form1's panel control.

In the Windows API the game engine just needs to attach itself to one window handle. What I was thinking was: all the game engine has to do is read the map file created by the editor, gather all the entities, then handle the game-logic side from there. Whereas in the editor, I can create the scene and decide how the entities should be textured and whatnot. I can't leave OutputWindow NULL when I initialize the swap chain. So that sums up what I was confused about. I was replicating the engine so it could be a real-time editor and allow more productivity.
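For reference, getting a valid OutputWindow from a WinForms control can look something like this in a C++/CLI layer ('panel' is a hypothetical System::Windows::Forms::Panel^ hosting the editor viewport):

// The HWND the swap chain needs comes straight from the control's Handle.
HWND hwnd = static_cast<HWND>(panel->Handle.ToPointer());

DXGI_SWAP_CHAIN_DESC desc = {};
// ... fill in buffer format, usage, sample desc as usual ...
desc.OutputWindow = hwnd;  // must be a valid window handle, never NULL
desc.Windowed     = TRUE;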

Never mind bro, I think I've got this - I'm going to make one managed assembly for the editor and another managed assembly for the different types of editors, and then I'll be straight. That's because the Material Editor has different functions than the level editor. I understand what you told me, though! Thanks for the reply.
