Hi guys, as the title says, I'm testing the temp senders from my flatheads, and they react quite differently when placed in boiling water. Assumption 1: boiling water should represent a hot engine. Most senders read around 13-15 ohms at room temperature. One sender took a few minutes to start increasing, then climbed quite quickly to around 4000 ohms; taken out of the boiling water, it dropped suddenly into the 30s, then the 20s, and under cold water it returned to 24 ohms. The other two don't really move out of the teens, going up only one or two ohms in boiling water. I'm looking for advice from those who know what these senders should read in ohms; "chuck it in your car and see" is not an option, unfortunately. Regards
Thanks to rotorwrench at the fordbarn: These senders use a bimetallic strip that is sensitive to heat, with a heater coil wrapped around it. At one end of the sender's bimetallic strip there is a set of contact points. The indicator gauge is built the same way, except it has only a heating coil. When power is applied, the coil in the gauge heats up and returns the needle to the cold side. As the sender warms up with rising coolant temperature, its bimetallic strip starts to flex and causes a momentary opening of the contacts. This lets the heater coil in the gauge cool down just a bit and pull the indicator needle slightly toward the hot side. The opening and closing cycle repeats continuously: the cooler the water, the less the points open and close; the hotter the water, the faster the cycle repeats, giving a higher temperature indication. When the two-pole unit's switch opens, the gauge just goes to full hot, since it is in effect turned off, allowing a rapid cool-down of the heater coils. All of this is calibrated to give a full-hot indication when the coolant nears the boiling point. Kerby
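The duty-cycle behavior rotorwrench describes can be sketched in a little toy simulation. This is only an illustrative model of the general principle (contacts open for a larger fraction of each cycle as temperature rises, letting the gauge heater cool and the needle drift toward hot); the temperatures, cycle rate, and time constant below are made-up assumptions, not Ford calibration figures.

```python
def simulate_gauge(temp_f, seconds=60.0, dt=0.01):
    """Toy time-stepped model of a bimetallic thermal gauge.

    Assumptions (illustrative only, not from any Ford spec):
      - sender contacts are open for a fraction of each cycle that
        grows linearly from 0 at 100 F to 1 at 212 F
      - the gauge needle follows the heater state with a first-order
        thermal lag (time constant tau)
    Returns needle position: 0.0 = full cold, 1.0 = full hot.
    """
    # Fraction of each cycle the contacts are OPEN (heater off).
    duty = max(0.0, min(1.0, (temp_f - 100.0) / (212.0 - 100.0)))
    needle = 0.0   # start at full cold
    tau = 5.0      # assumed gauge thermal time constant, seconds
    t = 0.0
    while t < seconds:
        cycle_pos = t % 1.0            # assume a 1 Hz contact cycle
        heater_off = cycle_pos < duty  # contacts open -> heater cooling
        target = 1.0 if heater_off else 0.0
        needle += (target - needle) * dt / tau  # first-order lag
        t += dt
    return needle
```

Running it shows the qualitative behavior from the explanation: near boiling the contacts stay open almost the whole cycle and the needle settles at full hot, at room-ish temperatures the contacts rarely open and the needle stays cold, and in between the needle hovers proportionally.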