I’ve had the ’66 for 22 years, and it normally runs at 210 on the gauge, higher in traffic. That’s according to the gauge. According to my IR thermometer, it’s running at 180 at the hottest point in the cooling system when the gauge says 210.
There was an excellent article in the Restorer a few issues back that dove into this. The theory: as the temperature goes up, the sender’s resistance goes down, so more current flows and the internal resistor (a thermistor) generates more of its own heat. The article argued that the resistor is not thermally bonded to the case internally, so it can’t dissipate the heat it’s generating; its own temperature climbs, which lowers the resistance even more, making the gauge read too high and keep climbing while the true temp of the engine is stable.
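Just to convince myself that feedback loop makes sense, I knocked together a little Python sketch of it. The numbers are not measurements: the beta value is fitted to the two chart points I quote further down (120 ohms at 180 degrees, 80 ohms at 210), the 14 volt supply is assumed to sit entirely across the sender, and the thermal resistance from thermistor to coolant is a pure guess picked to make the effect about the size I’m seeing. It’s only meant to show how a poorly bonded thermistor could settle well above coolant temperature.

```python
import math

# Illustrative only: beta fitted to the two chart points quoted later in
# this post; the thermistor-to-coolant thermal resistance is a guess.
V_SUPPLY = 14.0             # volts, assumed fully across the sender
R_180, R_210 = 120.0, 80.0  # ohms at 180 F and 210 F from my chart
R_THERMAL = 12.0            # deg F of rise per watt, hypothetical

def f_to_k(temp_f):
    return (temp_f - 32.0) / 1.8 + 273.15

# NTC thermistor model: R(T) = R0 * exp(BETA * (1/T - 1/T0)), T in kelvin
BETA = math.log(R_180 / R_210) / (1.0 / f_to_k(180.0) - 1.0 / f_to_k(210.0))

def resistance_at(temp_f):
    return R_180 * math.exp(BETA * (1.0 / f_to_k(temp_f) - 1.0 / f_to_k(180.0)))

def self_heated_temp(coolant_f, iterations=50):
    """Iterate to the thermistor's equilibrium temperature when its own
    I^2*R heat can only leak out through R_THERMAL."""
    t = coolant_f
    for _ in range(iterations):
        power = V_SUPPLY ** 2 / resistance_at(t)  # watts of self-heating
        t = coolant_f + power * R_THERMAL         # rise above coolant temp
    return t, resistance_at(t)

t_eq, r_eq = self_heated_temp(180.0)
print(f"coolant 180 F -> thermistor ~{t_eq:.0f} F, ~{r_eq:.0f} ohms")
# With these made-up numbers it settles around 209 F / 81 ohms, i.e. the
# gauge would read roughly 210 while the coolant is really at 180.
```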
So, it’s cold, foggy and generally a day for experimenting. First I got the car up to temp and made sure the thermostat was open. The gauge read 210; the IR thermometer read 180 at the base of the sender, the hottest spot I could find. I shut down, quickly disconnected the lead to the sender, and measured its resistance with the engine not running. It read 120 ohms, which corresponds to 180 degrees on the table I have.
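For anyone following along with a meter, here’s the kind of rough ohms-to-degrees conversion I’m doing. I’m only using the two points from my chart that matter here, with a straight-line interpolation between them; a real sender chart has more points and isn’t linear, so treat this as ballpark only.

```python
def ohms_to_temp_f(ohms, p1=(120.0, 180.0), p2=(80.0, 210.0)):
    """Linear interpolation between two (ohms, deg F) chart points."""
    (r1, t1), (r2, t2) = p1, p2
    return t1 + (ohms - r1) * (t2 - t1) / (r2 - r1)

print(ohms_to_temp_f(120.0))  # 180.0 - what I measured at the sender
print(ohms_to_temp_f(80.0))   # 210.0 - what the gauge was claiming
```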
Then, with the key on but the engine not running, I substituted a resistor box set to 120 ohms for the sender and the gauge read 180 degrees. WTF?
The other thing I have noticed is that the tip of the sender is hotter than the body of the sender when read with the IR thermometer, which would support the theory that the resistor is not thermally bonded to the case properly.
I think that, because of that, the resistor cools more quickly, and by the time I measured the resistance with the power off it had already stabilized.
If you do Ohm’s law with the system voltage at 14 volts and the temp sender at 120 ohms (180 degrees), that resistor has to dissipate 1.6 watts. When the resistance gets down to 80 ohms (which equates to 210 degrees on my chart), it has to dissipate 2.45 watts. That’s a lot of power for a small resistor with no heat sink, buried inside the sender case rather than out in free air.
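For anyone who wants to check that arithmetic, here it is as a quick Python sanity check. It assumes, like the numbers above, that the full 14 volts ends up across the sender, which ignores whatever the gauge drops in series.

```python
# P = V^2 / R: power the sender's internal resistor has to burn off
for ohms, temp_f in [(120.0, 180.0), (80.0, 210.0)]:
    watts = 14.0 ** 2 / ohms
    print(f"{ohms:.0f} ohms ({temp_f:.0f} F): {watts:.2f} W")
# 120 ohms (180 F): 1.63 W
# 80 ohms (210 F): 2.45 W
```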
I did notice that if I measure resistance from the disconnected lead to ground it reads 70 ohms. Clearly it’s reading back through the gauge (shunt), but since the gauge reads close across the whole range, the sender reads the correct resistance at 180 degrees, and I don’t have a schematic of the whole circuit, I chose to chalk that up to something else I don’t understand. Like how my wife thinks.
I do have an NOS sender I got 20 years ago. I’ve given it the boiling water test and its resistance looks OK there, but there was no current going through it, so who knows what it will do in the car? One theory from the Restorer article was that original and/or older senders had better thermal bonding.