I am working on a problem where I want to render 3D objects in pseudo-2D by transforming to NDC coordinates in my vertex shader. The models I'm drawing have numerous components rendered in separate stages, but all components of a given model share the same single point of origin.
This all works fine, but each vertex shader for the various stages of the model render redundantly transforms from Cartesian xyz to NDC coordinates before performing its real work. Instead, I'd like to perform an initial conversion stage that populates a buffer of NDC coordinates, so that all subsequent vertex shaders can simply accept an NDC coordinate as input. I'm also looking to avoid doing this on the CPU, as I may have many thousands of model instances to work with.
So, with an input buffer containing thousands of Cartesian positions, and an equal-sized output buffer to receive the transformed NDC coordinates, what are my best options for performing the work on the GPU? Is this something I need to look to OpenCL for?
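For context, what I have in mind for the conversion stage looks roughly like the following GLSL compute shader sketch (this assumes GL 4.3+ for compute shader support; the `uMVP` uniform, the binding points, and the workgroup size are just placeholders I've made up):

```glsl
#version 430
layout(local_size_x = 256) in;

// Input: thousands of Cartesian positions (xyz used, w ignored).
layout(std430, binding = 0) readonly buffer InPositions { vec4 inPos[]; };
// Output: equal-sized buffer of NDC coordinates.
layout(std430, binding = 1) writeonly buffer OutNDC { vec4 outNdc[]; };

uniform mat4 uMVP;   // assumed combined model-view-projection matrix
uniform uint uCount; // number of positions in the input buffer

void main() {
    uint i = gl_GlobalInvocationID.x;
    if (i >= uCount) return; // guard the tail of the last workgroup

    vec4 clip = uMVP * vec4(inPos[i].xyz, 1.0);
    outNdc[i] = vec4(clip.xyz / clip.w, 1.0); // perspective divide to NDC
}
```

The idea being that each later rendering stage would then bind the output buffer as its vertex input and skip the matrix multiply entirely.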
Being fairly unfamiliar with OpenCL, I was thinking of looking into ways of setting things up so that the first component to be rendered for my models 'knows' it is first, have its vertex shader do the standard transform to NDC, and somehow write the results back to an 'NDC coord buffer'. All subsequent vertex shaders for the various model components would then use the NDC coord buffer as input, skipping the redundant conversion effort.
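In other words, something like this for the first stage's vertex shader, with the extra output captured into the NDC buffer via transform feedback (`vNdcPos` is a name I've invented; as I understand it, it would need to be registered with `glTransformFeedbackVaryings` before linking, and the capture buffer bound with `glBindBufferBase` inside a `glBeginTransformFeedback`/`glEndTransformFeedback` pair):

```glsl
#version 330
in vec3 aPosition;

// Captured to the NDC buffer via transform feedback.
out vec3 vNdcPos;

uniform mat4 uMVP; // assumed combined model-view-projection matrix

void main() {
    vec4 clip = uMVP * vec4(aPosition, 1.0);
    vNdcPos = clip.xyz / clip.w; // NDC written to the feedback buffer
    gl_Position = clip;          // this first stage still rasterizes normally
}
```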
Is this reasonable?