quick question about programming resistors

gengis

New Member
Is there any good reason for duplicating the programming resistors and diode on every project? Is there any disadvantage to just mounting the resistors on my DB9 connector, then using a 3-pin header to program the board?

That seems to be the only way I see it done, but it looks like wasted effort.

The input pin still needs a pull-down resistor, but 100K seems to work fine and doesn't really affect the programming signal.

Another question ... what is the purpose of the 180 ohm resistor on the output pin? I'm assuming it's there to limit current. Why? Because the Rev-Ed stereo jack program cable shorts it when plugging and unplugging?
 

Technical

Technical Support
Staff member
We don't really consider 10k + 22k resistors to be much more complicated than just adding a 100k - it's only one more resistor - but your system would work in theory. The mass-produced AXE026 serial cable has no resistors inside simply because that was cheaper to manufacture.

The 180 is optional and just an added precaution against accidental incorrect wiring.
 

hippy

Ex-Staff (retired)
Perhaps the main argument for using the download resistors is that it's not usually much harder to put two resistors and a header on than a single resistor and a header. Even with the BAT diode and the 180R that's not a lot extra, and many people leave those off.

A 100K pull-down may work, but it's quite a high value and could be prone to noise pick-up in some cases, and it will affect the input voltage; the 22K and 100K acting as a divider will reduce a 5V signal to about 4V, which can lead to problems.

The best way to configure such a programming adapter is something like a 10K resistor as a pull-down, then buffer the signal in the adapter so there is a full 5V signal which can be connected to the serial pin when needed.
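To put rough numbers on that, here's a sketch of the divider arithmetic (assuming an ideal driver and no loading on the pin; only the 22K/100K/10K values come from the posts above):

```python
# Divider arithmetic for the download input (rough sketch: assumes an
# ideal driver and an unloaded pin; values are the usual PICAXE
# download circuit figures discussed above).
R_SERIES = 22_000   # ohms, series resistor in the standard circuit
V_DRIVE = 5.0       # volts, from a buffered TTL-level driver

for r_pulldown in (100_000, 10_000):
    v_pin = V_DRIVE * r_pulldown / (R_SERIES + r_pulldown)
    print(f"{r_pulldown // 1000}k pull-down: pin sees {v_pin:.2f} V")

# 100k pull-down: pin sees 4.10 V - a marginal logic high
# 10k pull-down:  pin sees 1.56 V - far too low, which is why the
#   buffered adapter must drive the pin with a full 5V signal rather
#   than push it through the 22k
```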

The 180R on the output is there to prevent possible short-circuit damage. Shorting when removing the program cable is unlikely, though it is recommended not to disconnect the cable while powered, and the risk of a short increases when multiple connections are made from the same PC.
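For a rough sense of what the 180R buys you (a sketch; the ~25mA absolute-maximum pin current is a typical PIC datasheet figure, assumed here, not quoted in this thread):

```python
# What the 180R limits a dead short to (rough sketch; the ~25 mA
# absolute-maximum pin current is a typical PIC datasheet figure and
# an assumption here, not a quoted spec).
V_OUT = 5.0        # volts, serial-out pin driving high
R_SERIES = 180.0   # ohms

i_short = V_OUT / R_SERIES
print(f"Worst-case short current: {i_short * 1000:.1f} mA")  # ~27.8 mA

# Close to, rather than comfortably under, a typical ~25 mA pin limit,
# but a vast improvement on the near-unlimited current of a direct
# short - enough to make a brief accidental short survivable.
```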
 

gengis

New Member
I guess I hadn't thought about it.

Last time I checked, using the troubleshooting option in the Programming Editor, pin 2 was +5.2V when the supply voltage on the 'axe was +4.5V. Someday I'll have to see what the bare, unadulterated serial signal really is.

So far I've always run my axes at ~3-4.5 V and never had problems programming them.

I assumed there would be way more voltage than the 'axe should ever see on its input, and that that was the reason for the 22K series resistor - the internal diodes clamp it to the power supply and ground.
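Putting rough numbers on that guess (a sketch: the ±12V level, the ~0.6V diode drop, and the ~20mA clamp rating are typical figures I'm assuming, not specs from this thread):

```python
# Back-of-envelope clamp current when a nominal +12V RS232 level hits
# the 22k series resistor (the 12V, ~0.6V diode drop, and ~20mA
# typical clamp rating are assumptions, not figures from this thread).
V_RS232 = 12.0     # volts, nominal RS232 drive level
V_SUPPLY = 5.0     # volts, PICAXE supply
V_DIODE = 0.6      # volts, internal clamp diode forward drop
R_SERIES = 22_000  # ohms

i_clamp = (V_RS232 - V_SUPPLY - V_DIODE) / R_SERIES
print(f"Clamp current: {i_clamp * 1000:.2f} mA")  # ~0.29 mA

# Two orders of magnitude below a typical ~20 mA clamp-diode rating,
# which is why the 22k series resistor makes a direct RS232-to-pin
# connection safe.
```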

RS232 "standard" is plus and minus twelve volts nominal - although that may have changed. Transmit was supposed to be anywhere from 5-15 volts and the receiver should work with anything from 3-25 volts. This all dates back to the selector magnets inside electromechanical teletype machines (I believe). I think the machines themselves had tungsten ballast lamps to regulate the current.

Did the advent of computers change the standard?
 

Ralpht

New Member
RS232 "standard" is plus and minus twelve volts nominal - although that may have changed. Transmit was supposed to be anywhere from 5-15 volts and the receiver should work with anything from 3-25 volts. This all dates back to the selector magnets inside electromechanical teletype machines (I believe). I think the machines themselves had tungsten ballast lamps to regulate the current.

Did the advent of computers change the standard?
Way back in the old days, the standard for true RS232 was plus and minus 21 volts, with -21V being the "mark" and +21V being the "space".

The standard "changed" to plus and minus 12V around the time single-board computers and the PC started getting popular. They all had the usual +5V power supply, as well as plus and minus 12 volts for other peripherals (hard and floppy drives, etc.) that needed the higher voltages.

I think the plus and minus 12V RS232 became the de facto standard for quite a long time, and it was "not that long ago" that it became official - I can't remember the actual dates; it might even be as far back as 20 years ago, but no doubt it can be looked up on Wikipedia.

Long ago, I used to have a single-board computer with a serial line that connected to a terminal at plus and minus 21V, and it wouldn't work if the voltage dropped much below 18V.

It also had the even older 20mA current loop interface to connect to a Teletype. The good old days...
 

Ralpht

New Member
Flooby -
This all dates back to the selector magnets inside electromechanical teletype machines (I believe). I think the machines themselves had tungsten ballast lamps to regulate the current.
Teletype machines were 20mA current loop devices, not RS232.

I have never run across any that used RS232, but will stand corrected if I'm wrong.

The lamp was used to ensure the current was as close to exactly 20mA as possible.
 

gengis

New Member
OK thanks. The 21 volts and 20mA loops make more sense for powering magnets.

3 volts is just too low for copper wire runs at 20mA.

I just got a short tour through the crypto spaces in the Navy. Rows of clacking machines that looked ancient, and banks of dimly glowing lamps. TTY wasn't my specialty, but I had the clearance and wanted to see what went on.
 

hippy

Ex-Staff (retired)
RS232 "standard" is plus and minus twelve volts nominal ... Did the advent of computers change the standard?
The standard remains the same; it's really that some systems do not use the standard but call it by the standard's name. So "RS232" has become the generic name for serial data with two voltage levels which carries data conforming to its protocol. RS232 has two parts, the electrical side and the data format; if the data format matches RS232, people tend to call it RS232 even when the electrical side doesn't comply.
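To make the data-format half concrete, here's a rough sketch of how one byte is framed on the wire (the 4800 baud figure is the usual PICAXE download rate - an assumption on my part, not something stated in this thread):

```python
# Rough sketch of the "data format" half of RS232: one byte framed as
# 1 start bit, 8 data bits LSB-first, 1 stop bit, no parity. The
# 4800 baud figure is the usual PICAXE download rate - an assumption
# here, not something quoted in the thread.
def frame_byte(value: int) -> list[int]:
    bits = [0]                                    # start bit (space)
    bits += [(value >> i) & 1 for i in range(8)]  # data bits, LSB first
    bits.append(1)                                # stop bit (mark)
    return bits

print(frame_byte(0x55))  # [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]

# The electrical half decides how those 1s and 0s appear on the wire:
# true RS232 puts a mark at a negative voltage and a space at a
# positive one, while a TTL-level "RS232" link uses 0V/5V - same
# framing, different electrical side, which is the point above.
```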
 

leftyretro

New Member
OK thanks. The 21 volts and 20mA loops make more sense for powering magnets.

3 volts is just too low for copper wire runs at 20mA.

I just got a short tour through the crypto spaces in the Navy. Rows of clacking machines that looked ancient, and banks of dimly glowing lamps. TTY wasn't my specialty, but I had the clearance and wanted to see what went on.
I worked on teletype/crypto in the Air Force. We were still using 60mA signal lines in the '60s, where the loop voltage was typically 70-120VDC for an open loop. There was a change to 20mA that most of the commercial/Western Union/AT&T world liked to use. There was also a bipolar current loop standard where a mark was current in one direction and a space was the opposite current; it had a much better signal-to-noise ratio but wasn't used by many.

RS-232 was developed by AT&T, I think, and was never designed to directly drive teletype selector magnets; however, there were some newer teletype models that had optional built-in RS-232 to current loop converters inside them. RS-232 was always a wide-ranging voltage interface, and many companies would fudge, screw with, and otherwise find some way to get their devices to send or receive it.

Actually the Picaxe downloading interface is the most bizarre adaptation for making a single 5VDC device work with RS-232 that I have ever come across. However, it works, it's cheap and simple, so what's not to like about it ;) Unless you own a laptop that doesn't like to play with it :p

Lefty
 