
reading texture


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

2 replies to this topic

#1 lomateron   Members   -  Reputation: 303

Posted 11 March 2012 - 04:09 PM

Is it possible to render to a texture in this way? If so, how?
The texture is 11x1.
The pixel shader will read the 1st pixel and output to it (it can read whatever it wants; the important thing is the output position), then it will jump and read the 3rd pixel (or whatever) and output to it, then it will read the 5th pixel and output to it, ... until it finishes. I have thought about changing the viewport. Suggestions please.


#2 Dawoodoz   Members   -  Reputation: 294


Posted 13 March 2012 - 04:01 AM

You can do it on the graphics card (like post effects are done with a full-screen quad), but the problem is so small that the CPU would be much faster, even if you don't read the result back to the CPU.

If you can combine many of your calculations into a 512x512 texture, you can render a quad over the screen, do the calculations in parallel from one texture to another, and then swap the pointers to the two textures.
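A minimal sketch of the shader side of that ping-pong pass, assuming the application binds the previous texture as an input and swaps the two textures between passes (the names and the per-pixel calculation here are illustrative, not from any particular engine):

```hlsl
// Hypothetical ping-pong pixel shader: reads the texel from the previous
// pass's texture and writes a new value to the same position in the other
// texture. The CPU side swaps which texture is input and which is the
// render target every pass.
Texture2D PrevState : register(t0);

float4 StepPS(float4 pos : SV_Position) : SV_Target
{
    // Load the texel corresponding to this output pixel (no filtering).
    float4 v = PrevState.Load(int3(pos.xy, 0));
    // ... per-pixel calculation goes here; identity shown as a placeholder ...
    return v;
}
```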

My open source DirectX 10/11 graphics engine: https://sites.google.com/site/dawoodoz

"My design pattern is the simplest to understand. Everyone else is just too stupid to understand it."


#3 Hodgman   Moderators   -  Reputation: 28615


Posted 13 March 2012 - 04:10 AM

You need two 11x1 textures, because you can't read and write the same texture at the same time. You have to read from one texture and write to a copy of it.

If you want to skip every second pixel, you can either:
* draw a triangle that covers the whole render-target and use SV_Position in the pixel shader to find out which pixel is being executed, and use clip to abort processing of any 'even' pixels.
OR
* draw a list of points that cover exactly the pixels that you want to process.
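The first option can be sketched like this (assuming the source texture is bound at t0; whether you keep odd or even columns is up to you):

```hlsl
// Sketch: full-screen triangle over an 11x1 render target, using clip()
// to discard the columns you don't want to overwrite. clip() kills the
// pixel when its argument is negative.
Texture2D Source : register(t0);

float4 OddPixelsPS(float4 pos : SV_Position) : SV_Target
{
    int x = (int)pos.x;        // pixel column, 0..10 for an 11x1 target
    clip((x & 1) * 2 - 1);     // -1 for even columns -> pixel discarded
    return Source.Load(int3(x, 0, 0)); // read the same column from the copy
}
```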


Also, you can't process the 1st pixel, then the 2nd pixel, then the 3rd pixel, etc. The GPU launches the processing of all rasterised pixels in parallel, and the order in which they're processed is undefined.






