Posted on the WIKI too, but I thought it would reach a bigger audience if it were posted here as well.
Misfire detection

With the advent of the new firmware versions (currently using 1.1.66, but I have worked on this idea on and off since 1.1.4x) there is the possibility of creating a misfire detection algorithm.
I have made a first attempt at this in order to diagnose an intermittent low-RPM misfire I seem to have on my test car.
The algorithm I devised uses the "rtestN" columns from the CSV datalogs.
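For anyone who wants to play along, getting at those columns is straightforward. A minimal Python sketch, assuming the columns are literally named "rtest1", "rtest2", etc., so adjust to whatever your log's actual headers are:

```python
import pandas as pd

# Sketch: pull the tooth-time columns out of a CSV datalog.
# "datalog.csv" and the "rtest" prefix are assumptions; match them
# to your own log file and firmware's column names.
log = pd.read_csv("datalog.csv")
tooth_cols = [c for c in log.columns if c.startswith("rtest")]

# Per-row standard deviation of the tooth times; everything
# below is built on this statistic.
log["tooth_std"] = log[tooth_cols].std(axis=1)
```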
How is this tooth timing data calculated? And are there any funnies in it relating to missing teeth and sample timings that I might need to account for? (Something for the developers)
This chart shows the standard deviation of the tooth times divided by RPM, plotted against RPM. We see considerably more deviation when the RPM is low, implying either that there is a low-RPM misfire OR that the cam is wild and there is simply a lot of "noise": charge robbing and the like causing weak combustion. To my mind, the spread of the data at low RPM is more consistent with an intermittent misfire than with simple bad cam behaviour.
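The plotted quantity is just this (again a sketch, carrying on from the snippet above and assuming the log has an "rpm" column):

```python
import matplotlib.pyplot as plt

# Normalise the deviation by RPM so slow and fast engine speeds are
# comparable, then scatter against RPM. "rpm" is an assumed column name.
plt.scatter(log["rpm"], log["tooth_std"] / log["rpm"], s=4)
plt.xlabel("RPM")
plt.ylabel("std(tooth time) / RPM")
plt.show()
```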
Here we see a histogram of the standard deviation of the tooth timings. Note that the distribution appears to be Weibull (possibly even Rayleigh), with a long tail of long tooth timings, which suggests I can treat this tail as containing all the misfires.
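If the Weibull assumption holds, you can fit the bulk of the distribution and call anything past a high quantile a misfire candidate. Roughly like this, where the 99th-percentile cutoff is an arbitrary placeholder rather than a tuned value:

```python
from scipy.stats import weibull_min

norm_dev = log["tooth_std"] / log["rpm"]

# Fit a Weibull to the whole sample (misfires included, so the fit is
# only approximate), then treat everything beyond a high quantile as
# a misfire candidate.
shape, loc, scale = weibull_min.fit(norm_dev.dropna(), floc=0)
cutoff = weibull_min.ppf(0.99, shape, loc=loc, scale=scale)
misfire_candidates = log[norm_dev > cutoff]
```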
This shows a datalog taken on a dyno with a section of deliberate misfires. The MAD_spark trace shows my misfire detector being triggered.
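For the curious, a median absolute deviation (MAD) flag of this general sort works along these lines; a sketch only, with k=5 as an illustrative placeholder threshold rather than the value I actually ran with:

```python
import numpy as np

def mad_flags(values, k=5.0):
    """Flag samples more than k robust sigmas above the median.

    k=5 is an illustrative placeholder, not a tuned threshold.
    """
    med = np.median(values)
    mad = np.median(np.abs(values - med))
    robust_sigma = 1.4826 * mad  # scales MAD to sigma for normal data
    return values > med + k * robust_sigma

log["misfire"] = mad_flags((log["tooth_std"] / log["rpm"]).to_numpy())
```

The point of using the median and MAD rather than mean and standard deviation is that the misfires themselves don't drag the baseline around, so the threshold stays put even in a log full of bad combustion events.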
Once I know how the tooth timing data is sampled, I will be able to work out some more useful stuff.
But before that happens, does anyone think I'm proceeding along sensible lines? Or am I barking up the wrong tree again?