So when we sample a signal, we take its values at discrete locations, which gives us a digital representation of the signal.
Translating this to a "real-world" example: if I'm sampling a texture in my pixel shader, why do I need to apply a reconstruction filter to the sampled signal in order to use it? Isn't it enough to just sample it, since we are in the world of the computer? Don't we just need a digital representation of the data? If I sample a texture, I can use e.g. a linear sampling function, which corresponds to a triangle filter.
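To make the "linear sampling = triangle filter" idea concrete, here is a minimal 1-D sketch (hypothetical function names, plain Python standing in for shader code): evaluating the texture at a fractional coordinate is a weighted sum of the discrete texels under a tent (triangle) kernel, and that sum collapses to ordinary linear interpolation.

```python
def tent(t):
    """Triangle (tent) filter kernel with support [-1, 1]."""
    t = abs(t)
    return 1.0 - t if t < 1.0 else 0.0

def sample_linear(texels, x):
    """Reconstruct the continuous signal at fractional position x
    by summing every texel weighted by the tent kernel centered on it.
    Only the two texels adjacent to x get nonzero weight, so this is
    exactly linear interpolation."""
    result = 0.0
    for i, value in enumerate(texels):
        result += value * tent(x - i)
    return result

texels = [0.0, 1.0, 0.5, 0.25]
print(sample_linear(texels, 1.5))  # halfway between texel 1 and 2 -> 0.75
print(sample_linear(texels, 2.0))  # exactly on texel 2 -> 0.5
```

So a "linear" texture sampler really is a reconstruction: the discrete samples are convolved with a triangle filter to produce a value at a position that was never stored.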
Does that mean the signal is still discretely sampled, but is being reconstructed with a triangle filter? The monitor also outputs digital values, so why do we even need to reconstruct the continuous signal at all?
Also, is it correct to think that we only need to reconstruct a signal when we want to resample it?