28x2 vref pin's useful voltage is way below the specified minimum.

wapo54001

Senior Member
I've been doing this for years, but got to thinking about describing it here when replying to the "Thoughts on a Unique I2C-DAC Interface" thread. I'm just putting this out there in case someone would find the concept useful.

The minimum voltage specification for the 28x2 vref pin was too high for my needs -- I needed a vref that was very low to handle the extreme low end of a current control range.

I had a requirement to deliver current accurately across a wide logarithmic range, from 10 mA down to 10 µA. I found that I could use the A.3 vref pin far, far below the specified minimum and, since I've now done this across quite a large number of 28x2 chips, I'm confident that it's a characteristic of the chip to support this even if it's outside the specification of the chip.

I have divided the current range into two adc10 ranges, one referenced to the supply voltage and one referenced to the vref voltage that met my needs. The vref voltage I use on pin A.3 is 0.297V, from a 316R and 4.99K voltage divider off the 5V supply.
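For anyone wanting to double-check the divider arithmetic, here's a quick sketch (plain Python, not PICAXE BASIC; the resistor values are the ones above):

```python
# Divider arithmetic for the A.3 vref: 4.99K from 5V to the pin,
# 316R from the pin to 0V (values as described above).
V_SUPPLY = 5.0
R_TOP = 4990.0     # 4.99K upper resistor
R_BOTTOM = 316.0   # 316R lower resistor

vref = V_SUPPLY * R_BOTTOM / (R_TOP + R_BOTTOM)
lsb_uv = vref / 1024 * 1e6   # one adc10 step at this reference

print(f"vref = {vref:.3f} V")            # ~0.298 V
print(f"adc10 step = {lsb_uv:.0f} uV")   # ~291 uV per count
```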

Using both ranges with auto-select code, it feels like I've achieved something like 12-bit control resolution from a 10-bit chip, but I can't do the math to prove it, so maybe I've got it all wrong. I can, however, confirm that I can control the LEDs in the LED/LDR packages to drive the LDRs accurately anywhere between 25 ohms and 50K ohms, with great precision.
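One rough way to put a number on the "feels like 12 bits" intuition (again a Python sketch, not PICAXE code, using the 0.297V figure above):

```python
import math

V_SUPPLY = 5.0
V_REF_LOW = 0.297   # the divider-derived vref on A.3
STEPS = 1024        # adc10

lsb_high = V_SUPPLY / STEPS   # ~4.9 mV per count, supply-referenced
lsb_low = V_REF_LOW / STEPS   # ~0.29 mV per count, low-vref range

# How many extra bits the low range gives for inputs below 0.297V:
extra_bits = math.log2(lsb_high / lsb_low)
print(f"extra bits below {V_REF_LOW} V: {extra_bits:.1f}")  # ~4.1
```

So below 0.297V the second range resolves roughly four bits finer than the supply-referenced range. The scheme isn't uniformly 12-bit across the whole span, but at the low end it's effectively better than 12-bit, which is consistent with the observed behaviour.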

Bottom line, I think the 28x2 vref is far more useful than the chip specification suggests, and folks whose projects need a vref well below the specified minimum should be willing to give it a try.
 

westaust55

Moderator
Thanks Wapo.
Well done for posting. 👍
While I do not have a use at this time it is good to know something new that could be useful in a future project.
 

hippy

Technical Support
Staff member
wapo54001 said: "I found that I could use the A.3 vref pin far, far below the specified minimum and, since I've now done this across quite a large number of 28x2 chips, I'm confident that it's a characteristic of the chip to support this even if it's outside the specification of the chip."
One has to wonder, if Vref can be used at substantially lower voltages than the specification, why the datasheet specifies the higher voltage rather than the lower one, which would make the chip more attractive to buyers?

I'm not saying it won't work, but there has to be a reason Microchip state a 2V Vref minimum for the 28X2, not less. I would guess that ties in with "for correct operation..." somehow.

From the comment alongside parameter CV02, "absolute accuracy", which states that it applies for "ΔVSRC >= 2V", it seems absolute accuracy is not guaranteed to +/- half an LSB when the Vref voltage is lower than 2V.

That may be inconsequential in many or most cases and, if it is, there would seem to be no good reason not to take advantage of the fact that Vref can go lower. One would, however, have to accept that this is presumption on my part; the datasheet doesn't detail what the actual consequences are or could be.

This could also explain why using the internal FVR at 1V8 for ADC also appears to work, even though the datasheet indicates the minimum should be 2.048V when used as an ADC Vref.
 

AllyCat

Senior Member
Hi,

Part of the ADC specification is that the ADC is "monotonic", i.e. that if you raise the input voltage very slightly then the reported value will always increase (or stay the same - never decrease) and vice versa. That's quite difficult to achieve when a large number of bits are changing at the same time, e.g. from %0111111111 to %1000000000, and becomes more difficult as the reference voltages or currents become smaller (relative to the "switches" that are being used in the ADC). So have you tested that your lower reference voltage is still achieving monotonicity?

That said, I have myself several times on the forum suggested using the FVR1024 as the ADC reference, on the basis that it can potentially give a useful increase in sensitivity. For example, if you are monitoring a current by measuring just a few hundred mV of drop across a series resistor, then any lack of monotonicity at the "major" bit changes may be irrelevant (and certainly preferable to the 5 mV resolution of the default supply-rail reference).

Cheers, Alan.
 

wapo54001

Senior Member
Alan, you are way over my head. Honestly, I have no idea as to your question, and I haven't tested for monotonicity (which I can only spell by referring to your post). My only "test" has been to have this approach work perfectly in the several dozen 28x2 chips that I have tried it in.

All I can tell you is that I tried this approach as the only possible way I could achieve both the wide current range and the high resolution at low current that I needed, and it worked. That second range with the 0.297V reference gave me the high-resolution low-current control that I needed.

I feel that this approach will open new vistas for folks trying to measure very small voltages or currents, or very small changes in them, with high resolution. It would be very interesting to know if this approach works equally well with the A.2 reference pin.
 

hippy

Technical Support
Staff member
wapo54001 said: "All I can tell you is that I tried this approach as the only possible way I could achieve both the wide current range and high resolution at low current that I needed"
I am not convinced it is "the only possible way" - surely it would have been possible to use a higher Vref and amplify the signal you are measuring through an op-amp?
wapo54001 said: "and it worked"
Technically, it "appears to have worked". We don't know what issues there may be, or in what circumstances it will not work as well as it has appeared to work in this case.

But the main thing is it works well for you. That cannot be discounted.
 

wapo54001

Senior Member
hippy,

The only other entity I know of that has succeeded in LDR control at my level is an outfit called Tortuga Audio, and they do it with a whole raft of chips, including a PIC and multiple 12-bit DACs. I do it with one 28x2 and four mosfets. Theirs is the only other approach I know of that has succeeded.

Tortuga Audio

I'll post a picture of my board for comparison when I can get to a different computer.

Yes, rigorously stated, "appears to have worked" is true -- I don't have the capacity to test beyond knowing that it "works for me." That said, my result has been 100% successful across many chips, with the caveat that my requirement is to reproduce a logarithmic resistance curve emulating a "perfect" logarithmic pot, so my verification measurements are of the resulting, reproducible resistances, not of the absolute linearity of the steps inside the chip as defined by the below-spec voltage reference.
 


AllyCat

Senior Member
Hi,

Actually, testing the ADC transfer characteristic is quite easy: for example, connect a large electrolytic capacitor (say 100 uF) and a high resistance (say 1 Mohm) from the ADC input pin to ground, and charge the capacitor up to above the reference voltage. Use a simple program loop to read the ADC and send the value to the Terminal. Then "inspect" the data, or copy it to a file for a spreadsheet or graph-plotting program. The values should decrease "smoothly" down to zero with no sudden "jumps" (up or down), although there might be some "noise".
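Once the discharge readings are captured, checking them can be automated. A sketch (Python, with a hypothetical helper name) that flags any wrong-way step in a falling run:

```python
def wrong_way_steps(samples):
    """Return (index, prev, curr) for every place a falling ADC run
    increases -- i.e. a monotonicity violation. Real data will also
    contain noise, so the size and frequency of jumps matters too."""
    return [(i, samples[i - 1], samples[i])
            for i in range(1, len(samples))
            if samples[i] > samples[i - 1]]

# Example: a mostly falling run with one wrong-way jump.
run = [512, 511, 511, 513, 510, 509]
print(wrong_way_steps(run))  # -> [(3, 511, 513)]
```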
hippy said: "I am not convinced of it being 'the only possible way' -"
Indeed. A common method, used particularly with audio signals, is "oversampling": typically you take four ADC10 readings in quick succession and sum the values to give a range of 0 to 4092, or "12-bit resolution". Of course, if the hardware were "perfect" then the measurements would all be the same and you'd just get the equivalent of multiplying by 4 (which does NOT increase the resolution), but in practice there is often "noise", so instead of reading say 2 + 2 + 2 + 2 = 8, you might get 2 + 1 + 3 + 2 = 8 (which happens to be the same). But IF the input voltage were "half-way" between the 2 and 3 levels, you might get 2 + 3 + 2 + 3 = 10, which is effectively an "11-bit" value (i.e. not one that could have been obtained by just multiplying by 4). In practice the PICaxe ADCs do "suffer" some noise (particularly at supply voltages above about 3 volts), so oversampling really can work.
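A small simulation of that effect (Python; the noise model -- half an LSB of Gaussian noise -- is an assumption, purely for illustration):

```python
import random

def adc10(voltage, vref=5.0, noise_lsb=0.5):
    """Idealised 10-bit ADC with some input-referred noise (in LSBs)."""
    code = round(voltage / (vref / 1024) + random.gauss(0, noise_lsb))
    return max(0, min(1023, code))

def oversample4(voltage):
    """Sum four quick readings, giving a 0..4092 result as above."""
    return sum(adc10(voltage) for _ in range(4))

random.seed(1)
lsb = 5.0 / 1024
v = 2.5 * lsb   # half-way between codes 2 and 3
print([oversample4(v) for _ in range(5)])  # intermediate sums appear
```

With the noise present, the sums cluster around 10 rather than landing only on multiples of 4, which is exactly the extra information oversampling extracts.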

I've used the method very successfully for measuring the supply voltage of M2 chips to much higher resolution than the normal CALIBADC10 command. My version relies not only on the PICaxe's inherent internal noise, but also on additional "randomness" or "noise" introduced by switching the signal via the PICaxe's DAC. I won't go into detail here, but the graph attached below may show how the measured ADC values of a gradually falling voltage can appear with various degrees of oversampling and noise (or perhaps with a reduced ADC reference voltage). It comes from post #24 of this long thread, which discussed "ADC resolution" and oversampling in some detail.

[Attachment 18779: graph of measured ADC values of a gradually falling voltage, with varying degrees of oversampling and noise]

Cheers, Alan.
 

wapo54001

Senior Member
The graph appears to be linear in both axes? I think we have different circumstances and requirements.

I can see how you might achieve higher reading resolution over time, but in my application I need a precise instantaneous value: immediately after reading the current through the LDR, I drive a mosfet gate up or down, or leave it as-is (pin set to input), and I do this for four separate LDRs per loop of the processor. Each individual reading must be accurate, and the reading-plus-correction must be as fast as possible (no unnecessary corrections).

My logarithmic read-and-adjust control range is 10,000:1 (10 mA to 10 uA, with 1 uA resolution at the 10 uA end), to be accomplished with a linear 1023-step adc10. Because of the non-linear LDR response as measured by a linear adc10, there are far more steps than needed at the low-resistance end, where I could control resistance to a fraction of an ohm, whereas at the high-resistance end each step spans many kilohms -- maybe 25K~50K ohms per microamp at 10 uA, becoming worse below 10 uA. At that end, an error of +/- one step of a linear 1023 range would have an unacceptable impact on stability and accuracy, and the resolution would be awful.
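To put rough numbers on that (a Python sketch; it assumes, purely for illustration, that measured current maps linearly onto the ADC span, scaled by the reference):

```python
# Why one linear adc10 range can't cover a 10,000:1 current span:
i_max = 10e-3    # 10 mA full scale
i_min = 10e-6    # 10 uA bottom of the range
steps = 1023

lsb_current = i_max / steps              # ~9.8 uA per count
print(f"one step = {lsb_current*1e6:.1f} uA")
print(f"counts below 10 uA: {i_min/lsb_current:.1f}")   # ~1 count!

# A second range referenced at 0.297V instead of 5V rescales the
# same 1023 counts onto the bottom 0.297/5 of the span:
lsb_low = (i_max * 0.297 / 5.0) / steps  # ~0.58 uA per count
print(f"low-range step = {lsb_low*1e6:.2f} uA")
```

Under those assumptions the low range gives sub-microamp counts, which lines up with the stated 1 uA resolution at the 10 uA end.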

The saving grace is that in a voltage-divider (potentiometer) circuit the lower-resistance side has greater control of the output than the higher-resistance side, and the greater the differential the greater that control. So tighter control of the low end of the resistance range gives much better precision overall than the +/- 25K figure would suggest.
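A quick illustration of that divider property (Python; the 100R series / 50K shunt values are hypothetical, just to show the scale of the effect):

```python
import math

def divider_db(r_series, r_shunt):
    """Attenuation of out = r_shunt / (r_series + r_shunt), in dB."""
    return 20 * math.log10(r_shunt / (r_series + r_shunt))

# With the series LDR low (100R), even a +/-25K swing on the 50K
# shunt LDR barely moves the output:
for r_shunt in (25e3, 50e3, 75e3):
    print(f"{r_shunt/1e3:>4.0f}K shunt: {divider_db(100, r_shunt):+.3f} dB")
```

A 25K error on the high-resistance side amounts to only a few hundredths of a dB when the other side is low, so the coarse high-end steps matter far less than they first appear to.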

I guess the bottom line is that there might be other ways to achieve better precision if time and accuracy-per-reading were not important, but I am not just reading the value -- I am also responding to each reading with a commensurate correction in real time, so accuracy per reading is essential. At the beginning of my project I tried to build a stereo pot with a simple linear 1023-step control and it didn't work worth a darn, due to the demands of the nonlinear LDR and the required audio taper of the pot.
 