Most games will only draw lens flare sprites at hand-selected locations in their level (usually light sources). Some games use a screen-space approach similar to bloom, where bright spots have a filter kernel applied to them and the result is composited over the screen (this is most likely what's being done in the screenshot that you posted). This is nice because it affects everything on the screen, but it can be difficult to achieve interesting flare shapes, since you're limited to what your filtering kernels can produce. It also doesn't let you generate flares for off-screen features. In The Order we use a screen-space approach, but we use FFTs to apply the filtering in the frequency domain. This lets you use arbitrary kernels from a texture, which is cool. However, the FFT is expensive, so you pretty much have to do it at a low resolution.
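To make the frequency-domain idea concrete, here's a minimal CPU sketch using NumPy in place of a GPU FFT. The function name and shapes are illustrative, not anything from The Order's actual implementation: it convolves a bright-pass image with an arbitrary kernel (the GPU version would read the kernel from a texture and run at a reduced resolution).

```python
import numpy as np

def fft_convolve(bright_pass, kernel):
    """Convolve a (downsampled) bright-pass image with an arbitrary
    flare kernel via the FFT. Both inputs are 2D float arrays of the
    same shape; on the GPU the kernel would come from a texture.
    Note this is circular convolution, so in practice you'd pad."""
    # Transform both into the frequency domain. ifftshift moves the
    # kernel's center to the origin so the result isn't translated.
    f_img = np.fft.rfft2(bright_pass)
    f_ker = np.fft.rfft2(np.fft.ifftshift(kernel))
    # Pointwise multiply in frequency space == convolution in image
    # space; transform back to get the flare contribution.
    return np.fft.irfft2(f_img * f_ker, s=bright_pass.shape)
```

The key property is that the cost is independent of the kernel's footprint, which is what makes arbitrary, screen-wide flare shapes practical.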
With D3D11-level hardware you could certainly try an approach where you analyze the screen, and spawn sprites at bright locations. Append buffers are useful for this, since you can analyze each pixel and throw the bright ones into your append buffer. Then you can use DrawIndirect to render all of the sprites in the buffer. I played around with this a long time ago and it's definitely workable, but at the time I had some trouble making it temporally stable while also keeping the amount of overdraw to a minimum.
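A rough CPU analogue of the analyze-and-spawn pass might look like the sketch below (names and threshold are made up). On the GPU, each thread would test one pixel and push hits into an AppendStructuredBuffer, and DrawIndirect would then render one sprite per entry without a CPU readback; here a plain list stands in for the append buffer.

```python
import numpy as np

def spawn_flare_sprites(luminance, threshold=1.0):
    """Return (x, y, intensity) entries for pixels brighter than the
    threshold. The returned list plays the role of the append buffer;
    the caller would then 'draw' one sprite per entry, which is what
    DrawIndirect does on the GPU using the buffer's counter."""
    ys, xs = np.nonzero(luminance > threshold)
    return [(int(x), int(y), float(luminance[y, x]))
            for x, y in zip(xs, ys)]
```

Note that this naive version spawns a sprite for every bright pixel, which is exactly where the overdraw problem comes from; a real implementation would cluster or downsample first, and that clustering is also where the temporal stability issues tend to creep in.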