Is there any smoothing applied to speedometer and tachometer gauge motors?

Hello fellow car enthusiasts!

I know this topic crosses a bit into electronics, but it's still related to cars I think :smiley:

I'm trying to build an instrument cluster (speedo, tacho, etc.) myself to use in my car, which only has a digital speedo. Most cars these days (the ones that still use analog gauges) use stepper motors to drive the speedo and tacho needles. I've gotten to the point where I've captured some data from a drive via the OBD-II port and wired up my motors to a test rig. I don't know how often speed and rpm values are usually sampled, but in my car I can only get them at 180 ms intervals from the OBD-II port. If I "play" the captured values to my cluster, the motors jerk (probably because the sampling rate is too low). If I play them to a BMW E36 cluster, however, the movement is much smoother (see attached videos).

Is there any hardcore car enthusiast here who can tell me whether any smoothing algorithm or similar trick is used in those clusters so that they don't seem jerky :smiley:?

Many thanks!

I can’t answer your question directly other than to say the BMW display does seem to have some damping/filtering in the cluster based on your test. Based on my own observations, gauges in my cars that use stepper motors have between a LOT and a HUGE amount of filtering.

Why don’t you add your own filtering to the signal? Use a simple 3, 5, or 7 data point moving average on the signal you feed the stepper motors. The fewer the samples, the better, or the signal will lag. Lag in the speedo is not such a big deal; lag in the tach signal would be.
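Something like this, for instance. Just a rough sketch, all the names are made up, and I'm assuming the values come in as plain integers roughly every 180 ms like you described:

// Simple moving average over the last N OBD samples (N = 3, 5 or 7 as above).
// All names here are illustrative; the 180 ms arrival rate is from the OP.
const int N = 3;
int  samples[N] = {0};
int  idx = 0;
long sum = 0;

// Call this each time a new speed/RPM value arrives; returns the averaged
// value to feed to the stepper.
int movingAverage(int newSample) {
  sum -= samples[idx];        // drop the oldest sample from the running sum
  samples[idx] = newSample;   // overwrite it with the newest
  sum += newSample;
  idx = (idx + 1) % N;
  return (int)(sum / N);
}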


Since values come in so infrequently, this type of filtering would add quite a big delay to the tachometer, yeah :confused: Also, in 540 ms (3 × 180 ms if averaging 3 values) a lot can happen: the needle could have already moved over half the gauge and back, but with averaging that would get lost.
It must be something about how the cluster acts on the next value that comes in, i.e. whether it is far from or close to the current position could influence how much the needle accelerates to reach it. If it's far away, maybe the movement is jumpier than if it's close (when it looks "lazy"). On my cluster it looks like the needle is too eager and arrives at the destination value before the next value is available, hmm.
Look at this:
at the end it sloowly stops https://youtu.be/2bPcrHnFDW4?t=7
when changing direction mid-path, it's much more responsive: https://youtu.be/2bPcrHnFDW4?t=18

The readings still change every 180 ms if you’ve done the filtering correctly. Think about that: you have a stream of data, each point the average of 3 readings, posting every 180 ms, not every 540 ms. If you don’t like that, use a low-pass filter set at about 10 Hz.

Considering how fast the speed actually changes… figure 0-60 mph in 3 seconds, or 16.6 data readings, or 3.6 mph (corrected per Keith’s keen observation) per 180 ms reading, I think this will work. Try it; if it doesn’t work, try a low-pass filter.
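If you go the low-pass route, a single-pole (exponential) filter is plenty. Rough sketch only, assuming you run it in the fast loop that updates the needle (say every 10 ms) rather than at the 180 ms data rate, with the latest OBD reading held as the target; the names and the 10 ms figure are placeholders:

// Single-pole (exponential) low-pass filter run in the needle-update loop.
const float CUTOFF_HZ = 10.0f;                             // cutoff suggested above
const float LOOP_DT_S = 0.010f;                            // assumed 10 ms update loop
const float RC        = 1.0f / (6.2831853f * CUTOFF_HZ);   // 1 / (2*pi*fc)
const float ALPHA     = LOOP_DT_S / (RC + LOOP_DT_S);

float filtered = 0.0f;

// 'target' is the most recent OBD reading, held until the next one arrives;
// the return value is what gets commanded to the stepper.
float lowPass(float target) {
  filtered += ALPHA * (target - filtered);
  return filtered;
}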

That should be 3.6 mph every 180 ms. But the momentum in the movements should smooth out the transitions somewhat.


Smooth vs. perfectly correct…
Concentrate on the tachometer, since the engine is able to change from 800 to 9000 RPM in a very short time. Clearly, even in 180 ms, the engine has made a significant change in RPM. So, even the 180 ms data dump gives smoothing. My question is this: exactly what do you want that tach to read? Are you going to read the tach and use that information in any significant way?
I am going to take a WAG and say the following: it is quite possible that more than smoothing of the data is occurring. Since the gauges are getting their data from a computer, that computer is likely giving PREDICTED data. The computer can see the car speed and engine speed rapidly rising, compute the acceleration, and use this to smoothly predict a future value (that is, milliseconds later). Of course, all this creates a disconnect from the actual instantaneous values of RPM and speed… but who cares? We don’t really drive expecting to know our mph to three or four significant digits at every instant during hard acceleration. Bottom line: we definitely want the computer to give simulated data to the gauges.
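For illustration only, a crude version of that prediction might look like the sketch below: estimate the rate of change from the last two readings and extrapolate forward while waiting for the next one. Every name and detail here is made up by me, not anything a real cluster is known to run:

// Crude linear prediction: extrapolate from the last two readings.
const float SAMPLE_MS = 180.0f;     // reading interval from the OP

float prevValue = 0.0f;             // reading before last
float currValue = 0.0f;             // latest reading
unsigned long currValueMs = 0;      // time the latest reading arrived

// Call when a new speed/RPM reading arrives.
void onNewReading(float value, unsigned long nowMs) {
  prevValue   = currValue;
  currValue   = value;
  currValueMs = nowMs;
}

// Call from the fast needle-update loop; returns the extrapolated value.
float predictedValue(unsigned long nowMs) {
  float slope   = (currValue - prevValue) / SAMPLE_MS;   // units per millisecond
  float elapsed = (float)(nowMs - currValueMs);
  if (elapsed > SAMPLE_MS) elapsed = SAMPLE_MS;          // stop extrapolating if data stalls
  return currValue + slope * elapsed;
}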

Sorry for the late reply, my job prevented me from trying this earlier. So I applied a moving average with a window of 3 to the values I send to the motors, and the result was this:

Now I see there are actually two levels of smoothing here. One is smoothing the actual readings from the engine: if the engine ramps up the rpm, we expect the needle to only move up. While this is, in essence, what is happening (rpm is increasing), there can be small jerks or dips where the rpm jumps up and down a bit several times. This causes the needle not to move smoothly up but to dip a little, go up again, then dip again before reaching the final position where the rpm settles.

The other kind of smoothing is interpolating the missing values between readings. In its basic form the needle controller has no idea what lies between two values; it just assumes the needle must move from point A to point B and does so without any regard for how this looks (cosmetically). This then looks like jerky, incremental movements (as seen in the video). The solution must be something along the lines of the controller delaying by one value so that it knows the next needle position in advance and can calculate the needed acceleration and speed. That way the controller knows not to decelerate if the next value will not land in the vicinity of the previous one and just continues on, so the movement looks smooth. Or, if the next value lands close to the previous one, just a bit farther along, it knows to start decelerating a little.

Now this all sounds nice and dandy, but how to actually do it is another story, or whether it even has merit.
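For what it's worth, here is how I imagine the "delay by one reading" idea could look. This is only my own rough sketch (not what the BMW cluster actually does): interpolate from the previous reading toward the newest one over the 180 ms gap, so the destination is always known in advance, at the cost of one reading of lag:

// One-reading-delay interpolation (my own guess at the technique).
const unsigned long SAMPLE_MS = 180;

float fromValue = 0.0f;             // needle position at the start of the segment
float toValue   = 0.0f;             // destination (the newest reading)
unsigned long segmentStartMs = 0;

// Call when a new OBD reading arrives: start a new interpolation segment.
void onNewReading(float value, unsigned long nowMs) {
  fromValue      = toValue;
  toValue        = value;
  segmentStartMs = nowMs;
}

// Call from the fast needle-update loop; returns where the needle should be.
float needleTarget(unsigned long nowMs) {
  float t = (float)(nowMs - segmentStartMs) / (float)SAMPLE_MS;
  if (t > 1.0f) t = 1.0f;           // hold position if the next reading is late
  return fromValue + t * (toValue - fromValue);   // linear interpolation
}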

The library I’m using at the moment has something called an accelTable where you can define how the needle accelerates. I tried changing it a bit, and I think some improvement can be seen, but it's still way off the nice, smooth-looking movement of real clusters.

before
// This table defines the acceleration curve.
// 1st value is the speed step, 2nd value is delay in microseconds
// 1st value in each row must be > 1st value in subsequent row
// 1st value in last row should be == maxVel, must be <= maxVel
static unsigned short defaultAccelTable[][2] = {
  {20, 800},
  {50, 400},
  {100, 280},
  {150, 170},
  {300, 120}};

after
static unsigned short defaultAccelTable[][2] = {
  {10, 20000},
  {20, 10000},
  {30, 5000},
  {50, 1000},
  {100, 600},
  {150, 450},
  {300, 350},
  {500, 290},
  {700, 200},
  {1000, 120}};

The awesome guy who wrote this library also suggested trying to slow the needle down so that it travels the desired distance in exactly 180 ms (in time for the next reading).
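If I understand that suggestion correctly, the arithmetic would be roughly this. Just a sketch with my own made-up function name; how the delay actually gets fed to the motor depends on the library:

// Pick a per-step delay so the move from the current step to the new target
// takes the whole 180 ms window before the next reading arrives.
const unsigned long SAMPLE_US = 180000UL;   // 180 ms in microseconds

// Returns the microsecond delay to use between steps for this move.
unsigned long delayPerStep(long currentStep, long targetStep) {
  long distance = targetStep - currentStep;
  if (distance < 0) distance = -distance;
  if (distance == 0) return SAMPLE_US;      // nothing to move; just wait for the next reading
  return SAMPLE_US / (unsigned long)distance;
}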

I am going to read the tach, but cosmetically pleasing movement has priority over complete correctness. Yeah, it's more or less for the looks, but I don't want it to be completely off / misleading.

I know what you mean by predicted data, but did they do that in cars made in the mid 2000s, did they have enough computing power? Or even those BMW dials I have from the E36, which is from the late 90s. They probably do some filtering before reporting the value as "the speed" or "the rpm" though, I agree with you, and I completely agree that totally accurate values should not be the goal.

Not too bad. Much smoother than example 1. Good job.

Thanks! Will have to play more with this and see where it leads :smiley:


I cannot answer your question as something like this is not anything I’ve ever been involved with.

However, from some past experience with European cars I can say that many of them use a solid state voltage regulator to maintain a steady voltage supply. Maybe voltage ripples are causing fluctuations. We always called them pitchforks. No idea if this will help or not.
