I recently realized that the textures in my game are all too small for high-DPI monitors and need to be scaled up, which would also force me to use compression. That gave me an idea: since DXT only stores two endpoint colors per 4x4 block and interpolates the remaining pixels from 2-bit indices, it should be possible to compress a downscaled 2x2 block instead and interpolate the rest at load time, then use something like HQ2X or superXBR to scale up the alpha channel.
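For context, here is a minimal sketch of how a standard DXT1/BC1 color block decodes, showing the endpoint-plus-interpolation scheme the idea builds on (the function names are my own, not from any particular library):

```python
import struct

def decode_rgb565(v):
    # Expand a 16-bit RGB565 value to 8-bit-per-channel RGB.
    r = (v >> 11) & 0x1F
    g = (v >> 5) & 0x3F
    b = v & 0x1F
    return ((r * 255) // 31, (g * 255) // 63, (b * 255) // 31)

def decode_dxt1_block(block):
    # A DXT1/BC1 block is 8 bytes: two RGB565 endpoint colors,
    # then sixteen 2-bit palette indices (one per pixel of the 4x4 block).
    c0, c1, indices = struct.unpack('<HHI', block)
    p0, p1 = decode_rgb565(c0), decode_rgb565(c1)
    if c0 > c1:
        # Four-color mode: two interpolated colors at 1/3 and 2/3.
        palette = [
            p0,
            p1,
            tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
            tuple((a + 2 * b) // 3 for a, b in zip(p0, p1)),
        ]
    else:
        # Three-color mode: midpoint plus a fourth entry
        # used for punch-through transparency.
        palette = [
            p0,
            p1,
            tuple((a + b) // 2 for a, b in zip(p0, p1)),
            (0, 0, 0),
        ]
    # Look up each pixel's 2-bit index, row-major order.
    return [palette[(indices >> (2 * i)) & 0x3] for i in range(16)]
```

So every pixel in the 4x4 block is already an interpolation between the two stored endpoints, which is what made me think a 2x2 source block might carry nearly the same information.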
So I have two questions based on this concept:
How much would starting from a smaller image like this hurt quality, given that DXT is already a lossy algorithm?
Is there an existing library that does this, or is my best bet to modify squish?