Hi All
Firstly, apologies if this has already been covered here and I have missed it.
The project I am working on is a deep well pump controller.
It uses a loop-powered pressure transducer (supply voltage range 10 to 55V) to measure tank water level; the loop current varies between 4 and 20mA in proportion to water level.
Normally in industrial control, to convert 4-20mA to 1-5V you just measure the voltage across a series 250 ohm resistor (V = I × R, so 4mA × 250Ω = 1V and 20mA × 250Ω = 5V).
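For reference, here is a minimal sketch of that scaling maths as it might run on the controller. It assumes a 10-bit ADC with a 5V reference and the 250 ohm sense resistor above; the ADC resolution and reference are assumptions, not details from my circuit, so substitute your actual values.

```c
#include <stdint.h>
#include <stdio.h>

/* Assumptions (not from the original circuit): 10-bit ADC, 5.0V reference.
   The 250 ohm sense resistor matches the 4-20mA -> 1-5V conversion above. */
#define ADC_MAX     1023.0f   /* 10-bit ADC full scale */
#define VREF_VOLTS  5.0f      /* ADC reference voltage */
#define SENSE_OHMS  250.0f    /* series sense resistor */

/* Convert a raw ADC reading into loop current (mA), then map
   4-20mA onto 0-100% water level. */
static float adc_to_level_percent(uint16_t adc)
{
    float volts   = (adc / ADC_MAX) * VREF_VOLTS;     /* 1-5V across resistor */
    float loop_ma = (volts / SENSE_OHMS) * 1000.0f;   /* Ohm's law: I = V/R */
    float level   = (loop_ma - 4.0f) / (20.0f - 4.0f) * 100.0f;
    if (level < 0.0f)   level = 0.0f;                 /* clamp out-of-range */
    if (level > 100.0f) level = 100.0f;
    return level;
}

int main(void)
{
    /* Example: mid-scale reading (~3V across the resistor = 12mA = 50%) */
    printf("level = %.1f%%\n", adc_to_level_percent(614));
    return 0;
}
```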
The circuit also has a serial LCD on the UART, a MOSFET driving a 5V relay, and a standard 6-pin ICSP header, none of which are shown here because I don't believe the problem lies with them.
The power supply is an ATX PC power supply.
I have approximated the pressure transducer here as a pot.
The problem is that when I connect the transducer, the voltage on my 5V rail drops right off (to approx. 2V).
Do I have to isolate my analog input given that it is supplied with a higher voltage?
I can't quite work out what is happening here.
Any help would be much appreciated.
Regards Bruce