Attached is the first part of some code. I have a sensor that transmits a fault signal when there is a problem; while the fault persists, the sensor retransmits the fault signal every 10 seconds. When the system is first switched on, a red LED flashes and then stays on to indicate that a valve is closed. A serin command with its timeout constant set to 16 seconds (16,000 ms) then listens for the fault signal; if no fault signal has been received by the timeout, the code moves on to the main section and the valve is opened, etc.
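Since the attachment may not be visible, here is a rough sketch of the logic described above, written in PICAXE BASIC. The pin assignments, baud mode, and the "FLT" qualifier are placeholders of my own, not taken from the attached code:

```basic
' Sketch of the start-up fault check (pin numbers and qualifier are assumed)
symbol RX_PIN = C.1        ' data output of the 434 MHz receiver
symbol LED    = B.0        ' red LED: valve-closed indicator
symbol VALVE  = B.1        ' valve control output

init:
    high LED               ' flash the red LED...
    pause 500
    low LED
    pause 500
    high LED               ' ...then leave it on: valve closed

    ' Wait up to 16 s for a fault frame. The fault repeats every 10 s,
    ' so on a healthy system serin should time out and jump to main.
    serin [16000, main], RX_PIN, N2400, ("FLT")

fault:
    goto fault             ' fault frame received: keep the valve shut

main:
    high VALVE             ' no fault within 16 s: open the valve
```

The observed symptom is that serin never takes the timeout branch to `main` when the micro's input is connected to the receiver.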
The problem I have is that the code never reaches the 'main' label. If the timeout is reduced to single-digit milliseconds it does work, but with very variable times, from near instant to tens of seconds. If the input to the micro is connected to 0 V, it works exactly as expected; when connected to the output of the receiver, I get the results described above.
The receiver and transmitter are cheap 434 MHz modules, but they work correctly in other parts of the code. With no signal present, the receiver's output is just logic-level noise, because without a signal its automatic gain control sits at maximum gain.
Any ideas on how to get around this problem?