My understanding concurs with the posts above. Long-term fuel trim is some version of a time average of short-term fuel trim. The averaging algorithm could be designed in different ways depending on what the manufacturer wanted to accomplish, so comparing short term to long term might produce unexpected results in any particular experiment. The two should be more or less consistent for an engine (already warmed to operating temperature) idling in the driveway with nobody touching the accelerator pedal.
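To make the averaging idea concrete, here's a toy Python sketch of one plausible design. The exponential-moving-average rule, the alpha constant, and the steady-state gate are all my assumptions for illustration, not any particular ECU's actual algorithm:

```python
# One plausible way long-term trim could be derived from short-term
# trim. Real ECU strategies vary by manufacturer; everything here is
# illustrative, not a specific ECU's logic.

def update_ltft(ltft: float, stft: float, steady_state: bool,
                alpha: float = 0.01) -> float:
    """Fold short-term trim into long-term trim with an exponential
    moving average, but only while the engine is in a steady operating
    condition (warm, closed loop, stable load)."""
    if not steady_state:
        return ltft  # freeze the average during transients
    return (1.0 - alpha) * ltft + alpha * stft
```

With a small alpha like this, the long-term value chases the short-term value slowly, so at a steady warm idle the two converge, which matches the driveway scenario above.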
Differences between long-term and short-term fuel trim would be especially noticeable during experiments where the throttle position and engine load are changing rapidly.
As posted above, a -15% fuel trim means the computer is finding it needs to inject 15% less gasoline in that particular operating circumstance to meet the O2 sensor's requirement of a proper a/f ratio, compared to the amount it thinks it should inject based (mostly) on the MAF and engine coolant temperature (ECT) sensors, ignoring the O2 sensor. A -15% fuel trim could be caused by the injectors delivering more fuel than the computer thinks they are (fuel pressure too high, a faulty or sticking injector), by actual airflow into the engine being lower than what the MAF sensor reports (I think that could happen if the MAF's wire was coated with gunk), or by the engine coolant temperature not being what that sensor reports. A -15% fuel trim might also be entirely normal if it occurred briefly in response to a quick change in throttle angle.
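The arithmetic is simple; with made-up numbers (the 4.0 ms base pulse width is an assumption, just for illustration):

```python
# Hypothetical numbers showing what a -15% trim does to fueling.
# base_pulse_ms is what the ECU would command from MAF + ECT alone;
# the trim scales it so the O2 sensor sees a correct a/f ratio.

base_pulse_ms = 4.0          # open-loop injector pulse width (assumed)
fuel_trim_pct = -15.0        # reported fuel trim

corrected_pulse_ms = base_pulse_ms * (1.0 + fuel_trim_pct / 100.0)
print(corrected_pulse_ms)    # 3.4 ms, i.e. 15% less fuel than calculated
```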
In other words, the engine-control computer is constantly reading the O2 sensor(s), the MAF, and the ECT, and it expects those readings to be self-consistent with each other. When they aren't, the computer trusts the O2 sensor by itself, which ensures the correct a/f ratio, and shows its cognitive-dissonance displeasure by reporting a non-zero fuel trim value.
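Here's a toy Python sketch of that closed-loop cross-check. The function name, the proportional enrichment factor, and the step size of the O2 correction are assumptions for illustration; real ECUs use much richer control logic than this:

```python
# Toy closed-loop fueling step: MAF + ECT give the model-based fuel
# amount, the O2 sensor nudges a trim term against it, and the trim
# itself is what gets reported to the scan tool.

STOICH_AFR = 14.7  # stoichiometric air/fuel ratio for gasoline

def fueling_step(maf_g_per_s: float, ect_enrich: float,
                 o2_says_rich: bool, trim: float) -> tuple[float, float]:
    """Return (fuel to inject in g/s, updated trim %)."""
    base_fuel = (maf_g_per_s / STOICH_AFR) * ect_enrich  # MAF + ECT model
    # Narrowband-style feedback: step the trim against whatever the
    # O2 sensor reports (rich -> trim down, lean -> trim up).
    trim += -0.1 if o2_says_rich else +0.1
    fuel = base_fuel * (1.0 + trim / 100.0)
    return fuel, trim
```

If the MAF/ECT model and the O2 sensor agree, the trim hovers near zero; if they persistently disagree, the trim walks off to whatever non-zero value reconciles them, which is exactly the "displeasure" a scan tool shows you.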