OK, this is probably a really dumb question, but I have been working through this derivation for a while, and every time I do, I fail to end up with the "- 1".

So here is how I got to where I am now...

Vout = Voltage Output

Vref = Applied voltage

R = variable resistor

ADC = value generated by an analog-to-digital converter (0 to 1023, since it's 10-bit)

Vout = R / ( R + 10k ) * Vref // Vout is the output of the voltage divider formed by R and the fixed 10k resistor
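To sanity-check that divider equation numerically, here's a small sketch (the 5 V Vref is just an example value I picked, not from the post):

```python
# Voltage divider: Vout = R / (R + R_fixed) * Vref
R_fixed = 10_000.0  # the fixed 10k resistor
Vref = 5.0          # example supply voltage (assumption)

def vout(R):
    return R / (R + R_fixed) * Vref

# With R equal to the fixed resistor, the divider splits Vref in half
print(vout(10_000.0))  # 2.5
```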

ADC = ( Vout / Vref ) * 1023

So...

ADC = { [ R / ( R + 10k ) * Vref ] / Vref } * 1023

gives us

ADC = [ R / ( R + 10k ) ] * 1023

After doing the math I am getting

R = 10k / (1023 / ADC)

The issue is that the equations I find online give

R = 10k / (1023 / ADC - 1)
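One way to compare the two inversions is a numeric round trip: pick an R, compute the ADC count with the combined formula from above, then feed that count back through the online formula and see whether the original R comes out (the 4.7k test value here is arbitrary):

```python
# Round-trip check of the inversion (10k fixed resistor, as in the post)
R_fixed = 10_000.0

def adc_from_r(R):
    # combined divider + ADC formula: ADC = R / (R + 10k) * 1023
    return R / (R + R_fixed) * 1023

def r_from_adc(adc):
    # the formula found online, with the "- 1"
    return R_fixed / (1023 / adc - 1)

R = 4_700.0            # arbitrary test resistance
adc = adc_from_r(R)
print(r_from_adc(adc))  # recovers ~4700.0 (up to float rounding)
```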

I can't for the life of me figure out where this - 1 is coming from.

Any thoughts?