Author Topic: Help with analog input  (Read 5506 times)

Offline CamB

Help with analog input
« on: April 23, 2015, 08:28:17 pm »
I seem to be trailing along 6-8 months behind a previous poster in trying to get a Round to do what I want... So I have read this and the help pages etc.

http://www.vemssupport.com/forum/index.php?topic=2310.0

I have 0-5V connected to the analog input (tried with a coolant sensor with a pull-up and also with a rotary switch set up as a voltage divider - essentially a stepped potentiometer). With the sensor it is as suggested in the other thread:

+5V ---- 1000 ohm resistor ---- X ---- sensor ---- gnd

With the analog input connected to X, the multimeter shows around 3V at room temperature, which is consistent with what you'd expect. If I use the rotary switch I can pick anything between 0.7V and 4.3V. So far, so good.
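
For anyone else following along, the divider arithmetic is just V_X = 5V * R_sensor / (R_pullup + R_sensor). A quick sketch of that in Python (the ~1.5k sensor resistance is only an assumption that happens to match the ~3V I measured):

def divider_voltage(r_sensor_ohms, r_pullup_ohms=1000.0, v_supply=5.0):
    # Voltage at point X in: +5V ---- R_pullup ---- X ---- R_sensor ---- gnd
    return v_supply * r_sensor_ohms / (r_pullup_ohms + r_sensor_ohms)

print(divider_voltage(1500))  # about 3.0V, roughly what the meter shows at room temperature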

Where I have trouble is the gauge - I see the live input on the sensor scaling graph moving only in a small range (if I understand the graph right), from say 0mV - 300mV with the input calibration at 32 and an offset of 0. I can change where it moves by changing the offset, and change the width over which it moves **somewhat** by changing the calibration (but not enough to be accurate), but I don't think this is how it is supposed to work. I would have expected to be able to move across the full (nearly) 0-5000mV range on the graph using the rotary switch.

It's like the gauge is seeing about 1/10 the expected voltage, basically.

So... Any pointers? What has my limited electrical knowledge failed to understand?

Offline CamB

Re: Help with analog input
« Reply #1 on: April 23, 2015, 10:09:09 pm »
Here is a pic of the multimeter (nasty cheap one - I know) and Vemstune. The right-hand column is intended to be degrees C based on the sensor specifications, 5V and a 1k resistor - I may completely misunderstand how this works.

http://s206.photobucket.com/user/cam_baudinet/media/2002%20Turbo/778f8b635abc453f5e65d40cc4bf08a4.jpg.html

Other notes:

- testing on a bench - neither WB nor EGT plugged in
- now that I think about it, what the gauge displays for Analog is nothing like the position on the graph (i.e. if it really is where the live readout on the graph shows, I'd expect 100-ish on the gauge, but I see 9 - the raw ADC value?)
« Last Edit: April 23, 2015, 10:14:52 pm by CamB »

Offline gunni

Re: Help with analog input
« Reply #2 on: April 24, 2015, 12:04:33 am »
In a case like this I suggest you change it all back to voltages and see if the two read the same.

i.e. 0 is 0 and 313 is 313, and so on up to 5000mV
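
In other words, make the table an identity map so the displayed value should equal the millivolts you feed in. Just a sketch of evenly spaced points (the point count depends on the curve you are editing):

def identity_curve(points=16, full_scale_mv=5000):
    # Evenly spaced points where the displayed value equals the input in mV
    step = full_scale_mv / (points - 1)
    return [round(i * step) for i in range(points)]

print(identity_curve())  # [0, 333, 667, ..., 4667, 5000]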

Offline CamB

Re: Help with analog input
« Reply #3 on: April 24, 2015, 11:13:07 am »
Thanks for the suggestion. I did this (actually reset the input curve to 0-240 and the calibration to 500 / 240 * 32, and moved the decimal point so it shows volts). Together with rereading the old advice on the forum, I think I understand it well enough now. I will write it up here in case anyone searches later and it helps.
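
(For anyone following the numbers: the 500 / 240 * 32 above is just the arithmetic behind the calibration value I entered - my reading of how the multiplier scales the 0-240 curve to 0-5.00V, so treat it as a sketch rather than a definitive VEMS formula.)

# Arithmetic only - not an official formula
new_calibration = 500 / 240 * 32
print(new_calibration)  # about 66.7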

I suspect what I did would have worked fine *if* the analog input took 0-5V in and it stayed as 0-5V, but with the 4V pull-up (or whatever it's called - sorry, I work for a bank and am not great with electrical stuff) on my earlier gauge:

- 0V in (grounding the input) is about 2.7V
- with the input disconnected, it's 4V, as you'd expect I guess with the 4V pull-up
- 5V in is about 4.3V (I say "about" because anyone else looking should measure)

So for me this essentially meant using only the points on the calibration graph between 2.7V and 4.3V (i.e. compressing the calibration curve to fit between 2.7 and 4.3V). With a linear sensor and a bit of empirical iteration on the calibration multiplier (easier with a higher multiplier, as I think this spreads the points out) it was pretty easy to get it how I wanted. An NTC sensor would be more of a challenge, just because 4 graph points rather than 16 gives less scope for a non-linear slope and it's best to measure / test, but it should still be fairly accurate.
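
If it helps anyone, this is roughly what "compressing" means for a linear sensor. It is only a sketch using my measured endpoints (2.7V with the input grounded, 4.3V with 5V applied - measure your own gauge), and the 0-10 bar sender numbers are made up for the example:

SEEN_AT_0V = 2.7   # what the graph shows with the input grounded
SEEN_AT_5V = 4.3   # what the graph shows with 5V applied

def seen_voltage(v_in):
    # Approximate voltage the gauge sees for a true input voltage, assuming the
    # pull-up squashes the 0-5V range roughly linearly into 2.7-4.3V
    return SEEN_AT_0V + (v_in / 5.0) * (SEEN_AT_5V - SEEN_AT_0V)

# Hypothetical linear sender: 0.5V at 0 bar, 4.5V at 10 bar
for v_in, bar in [(0.5, 0.0), (4.5, 10.0)]:
    print(f"{bar:4.1f} bar -> sender {v_in:.2f}V -> gauge sees ~{seen_voltage(v_in):.2f}V")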

And you do need to change the button setting on an older one like mine.

Also, I think I was successful in setting the boost control to trigger if the input above goes below a threshold (I am thinking of fuel pressure, for example), so I'm pretty happy.
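
For anyone doing the same, the logic is nothing more than "act when the analog reading drops below a threshold". A trivial sketch (the 2.0 bar fuel pressure cut-off is made up for the example):

LOW_THRESHOLD_BAR = 2.0  # made-up fuel pressure cut-off

def should_trigger(reading_bar):
    # Boost control action fires when the reading drops below the threshold
    return reading_bar < LOW_THRESHOLD_BAR

print(should_trigger(3.5))  # False - pressure healthy
print(should_trigger(1.2))  # True - below threshold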
« Last Edit: April 24, 2015, 11:20:15 am by CamB »