I was thinking about this while I was lying under my Jeep:
Feed a 0-5v signal into a 10 bit ADC and you get 1024 counts of resolution, about 4.9mv per step, which ain't bad. But-
What would happen if you took that 0-5v signal and subtracted 2.5v from it with an op-amp?
Now you have a signal that swings from -2.5v to +2.5v: a 0 to +2.5v half and a 0 to -2.5v half.
Take the 0 to +2.5v half, run it through another op-amp with a gain of 2 to get 0 to +5v, and feed that into one ADC input.
Take the 0 to -2.5v half, invert it with a gain of -2 to get 0 to +5v, and feed that into another ADC input.
Now you have ~1024 counts for the top half of your signal and another ~1024 counts for the bottom half, or 2048 counts for the whole signal: the equivalent of an 11 bit reading.
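For a sanity check, here's a minimal C sketch of the whole chain (ideal op-amps, an ideal 10-bit ADC, and a made-up 3.3721v test input; the adc() model and all the names are mine, just for illustration). It shifts, splits, applies the 2x gains, quantizes both halves, and recombines the two readings into one 0-2047 result:

#include <stdio.h>
#include <math.h>

#define VREF   5.0          /* ADC full-scale span, volts */
#define COUNTS 1024         /* 2^10 codes from a 10-bit converter */

/* Ideal 10-bit ADC: clamp to 0..VREF, quantize to 0..1023. */
static int adc(double v) {
    if (v < 0.0)  v = 0.0;
    if (v > VREF) v = VREF;
    int code = (int)floor(v / VREF * COUNTS);
    return code > COUNTS - 1 ? COUNTS - 1 : code;
}

int main(void) {
    double vin = 3.3721;               /* made-up test voltage */

    /* Analog front end from the post: subtract 2.5v, split,
       gain of 2 on the top half, gain of -2 on the bottom half. */
    double shifted = vin - 2.5;
    double upper = shifted > 0.0 ?  2.0 * shifted : 0.0;  /* 0..+5v */
    double lower = shifted < 0.0 ? -2.0 * shifted : 0.0;  /* 0..+5v */

    int hi = adc(upper);               /* counts above mid-scale */
    int lo = adc(lower);               /* counts below mid-scale */

    /* Recombine: mid-scale is 1024, and one of hi/lo is always zero.
       The result spans ~0..2047, i.e. an 11-bit reading. */
    int combined = COUNTS + hi - lo;
    double v_est = combined * VREF / (2.0 * COUNTS);

    printf("hi=%d lo=%d combined=%d -> %.4fv (true %.4fv)\n",
           hi, lo, combined, v_est, vin);
    return 0;
}

In this ideal model the recombined code really does have 2048 levels, exactly one extra bit, but that assumes the 2.5v offset, the gain resistors, and the two ADC channels all agree to better than the new ~2.4mv LSB.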
This doubles your resolution (one extra bit) at the cost of a cheap quad op-amp or two and an equally cheap ICL7660 flying-capacitor charge pump to generate the negative rail the op-amps need.
Will this work, or is this the electronic version of Washington's voodoo economics?