Different 2d interpolation functions

This is a forum for discussing the development and testing of alpha MS2/Extra code. Documentation
(Runs on MS2 and Microsquirt)

Moderators: jsmcortina, muythaibxr

robs
Master MS/Extra'er
Posts: 564
Joined: Sun Jan 17, 2010 4:26 pm
Location: Sydney, Australia


Post by robs »

This is something I posted to the B&G forums about 5 years ago. It's somewhat relevant to the Whittlebeast topic, but might be of more general interest. FWIW, I did write tuning software which used this model, and it did work and homed in pretty quickly on a target AFR. Unfortunately it was all wired up to the B&G data stream and I haven't made the substantial effort to update it to use the INI file.
[preamble snipped...]
Should the Megasquirt VE table be looked up differently? It currently uses a weighted average value from points on a rectangular grid. A TIN (Triangulated Irregular Network) is a reasonably standard way to model 3D surfaces in digital terrain models and might have advantages when it comes to modelling the 3D VE surface.

While some of the power of TINs is in their "irregularity" (i.e. you can have more sample points where the surface is changing fastest), the simplest way to use them for MS would be to divide each of the rectangles in the VE table down a diagonal and treat the resulting two triangles as perfectly flat facets of the VE surface.

Any three points lie, by definition, on a plane. As soon as a fourth point is introduced, you start having to deal with ambiguities about what shape you want. If the SW, SE and NW corners are all at 30 and the NE corner is at 40, what is the *right* value for the centre?

The chief argument in favour of a weighted average is that it gives a smoother surface. I don't dispute that, but it won't always be a better surface. In the case above, the MS algorithm would give the centre 32.5 on the basis of three "votes" for 30 and one for 40. The TIN would have it still stuck on 30. Sounds like the TIN's not as good. Fair enough. But in the case where you have three points on 80 and one on 70, the weighted average would pull the middle point down by 2.5, to 77.5. The TIN would keep it at 80, and that could well be better.

Thing is, a TIN is completely unambiguous. Every point on the surface is determined by the three points enclosing it. That the weighted average gives a smoother result is (IMO) jumbling things up. It's the job of tuning to make the surface smooth. When the tuning's right, the surface stored in the ECU shouldn't need smoothing every time you look at it.

And it's in tuning where the TIN should really shine. If the centre of the triangle is too lean, lift up all three vertices. In the middle of an edge? Just raise its two ends. Near a corner? Raise that vertex. All of these adjustments have easily understood effects. The same can't be said of applying these rules to the four corners when a weighted average is going to be used. Depending on the existing values, such a lift will introduce (or reduce) a ripple of some sort (a beautifully smooth one of course). Which, I guess, is why MegaTune and AMC shy away from adjustments when the current running point isn't near a vertex.

I'm thinking of writing my own AutoTune style program to implement just the tuning aspects of this. It would banish the "Tuning point not near vertex" message, and the result should still be pretty close to right even if MS won't interpret it as a TIN.

Still, I think there could be real merit in MS itself moving away from the weighted average to a simpler surface model. The implementation of this wouldn't be hard, at least for the lookup. A new version of intrp_2dctable() looks like it would do. Changes to AMC would be more involved, but manageable I think. Then there is all the ancillary software (MegaTune, etc.) which might need updating to understand the different interpretation of the VE grid. Once again, it sounds like it wouldn't be too onerous. As for using TINs to their limit -- truly irregular triangles that might automatically subdivide in areas where a flat surface won't work -- I suspect this would need too much RAM for the current hardware to support it, and all software would need plenty of work to understand it. Pretty much out of the question I expect.

Anyone still awake?

Have fun,

Rob.