How to operate on the values inside a reference table depending on external values? [Also: 2D interpolation]

The whole idea is:

I have a table where I compare the Temperature vs the Measured Sucrose (MS). I got some MS values in the lab but, depending on the temperature, I need to add or subtract some correction values (that’s the reason the table exists).

So, for an MS level of 80 at 25ºC, the real value is 80 + 0,38 = 80,38.

But I don’t always have natural numbers; most of the time I get numbers like 84,13 and temperatures like 24,5 ºC. So is there any way to apply the same reasoning to non-integer numbers? (maybe with something like the TREND function?)

Sucrose vs Temperature table.ods (25.6 KB)

[There is a serious error to be supposed in cell J22 of your table.]

I probably didn’t understand the question fundamentally. Surely I didn’t understand the term “external value” in the context.

The table has classes of 1K width concerning the temperature, and of 5 units (whatever they are) concerning the MeasuredSucrose.

The second (MS) resolution (in the values) is rather high anyway. Its maximum absolute step is 0.04.
Concerning T the resolution is lower. The absolute step is 0.19 at maximum, nearly 5 times the other maximum.

Do you therefore want an interpolation concerning T?

Well, the corrections are largely linear with respect to the temperature. However, near 35 °C there are unclear effects that make me doubt whether linear interpolation is a good idea.

If you want to implement a different interpolation, e.g. one of the types offered for trend lines in charts, you would need to specify the type (with good reasons).

Having done so, you should refine the table based on the chosen interpolation.

Imo, this isn’t easily justified. The class step concerning T should have been chosen smaller for the critical range near 35 °C if the provider had sufficient data to do so. And a (say polynomial) interpolating approximation would have been applied by the “researcher”.

Sorry for the late response, I have issues with the login process that I need to fix.

[There is a serious error to be supposed in cell J22 of your table.]

That is correct, thank you so much. I also corrected the values for T=34 ºC and 35 ºC, if I remember correctly.

I probably didn’t understand the question fundamentally. Surely I didn’t understand the term “external value” in the context.

Sure, I didn’t explain myself correctly:

the TREND function allows you to compare one value against a reference table of two columns (see the LibreOffice and OpenOffice TREND function documentation for more info).
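As a side note, TREND’s one-dimensional behaviour amounts to an ordinary least-squares line through the two reference columns; a minimal Python sketch of that model, where only the 0,38-at-25 °C pair comes from this thread and the other correction values are hypothetical placeholders:

```python
# Minimal sketch of what TREND does with a two-column reference table:
# fit a least-squares line, then evaluate it at a new x value.
# Only the 0.38-at-25 pair comes from the thread; the rest is made up.
import numpy as np

temps = np.array([20.0, 21.0, 22.0, 23.0, 24.0, 25.0])  # known temperatures
corr  = np.array([0.30, 0.32, 0.33, 0.35, 0.36, 0.38])  # known corrections

slope, intercept = np.polyfit(temps, corr, 1)  # degree-1 fit = TREND's model
predicted = slope * 24.5 + intercept           # correction at a new temperature
```

For a reading at 24,5 °C this predicts a correction between the two tabulated neighbours, which is exactly the behaviour TREND gives for values not present in the table.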

I’m looking for something similar that can be used with my table (with many columns), so it interpolates along both axes (rows and columns). “External value” was a poor way of saying “a value that is not included in the table”.

But I’m going one step further: I want to apply the interpolated value to the “external” value.

In other words: I get MS data at a certain temperature; my values are very precise (±0,01) in comparison with the data in the table (±5). The table is for correcting my data, so I can add/subtract certain values depending on the MS level and the temperature.

I think one way of doing it could be comparing it with the 4 values (2 of MS + 2 of T) around it, instead of the whole row and the whole column, since they are not quite linear across MS levels.
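The “4 surrounding values” idea is exactly bilinear interpolation. A minimal sketch, assuming hypothetical corrections at the four grid nodes around MS = 84,13 and T = 24,5 °C (the numbers in the grid are placeholders, not the real table):

```python
# Bilinear interpolation on the four table nodes surrounding a point.
# The correction values below are hypothetical placeholders.
import numpy as np

temps = np.array([24.0, 25.0])   # the two bracketing temperatures
ms    = np.array([80.0, 85.0])   # the two bracketing MS class limits
# corr[i][j] = correction at temps[i], ms[j]  (made-up numbers)
corr = np.array([[0.36, 0.37],
                 [0.38, 0.39]])

def bilinear(t, m):
    """Interpolate the correction at temperature t and measured sucrose m."""
    ft = (t - temps[0]) / (temps[1] - temps[0])  # fraction along the T axis
    fm = (m - ms[0]) / (ms[1] - ms[0])           # fraction along the MS axis
    top = corr[0, 0] * (1 - fm) + corr[0, 1] * fm  # interpolate row temps[0]
    bot = corr[1, 0] * (1 - fm) + corr[1, 1] * fm  # interpolate row temps[1]
    return top * (1 - ft) + bot * ft               # blend the two rows

corrected = 84.13 + bilinear(24.5, 84.13)  # add the correction to the lab value
```

The same arithmetic can be written as one Calc formula once the four bracketing cells are located with MATCH/INDEX; the Python version just shows the weighting scheme.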

The table has classes of 1K width concerning the temperature, and of 5 units (whatever they are) concerning the MeasuredSucrose.

I don’t understand what you mean by “1K width”.

The second (MS) resolution (in the values) is rather high anyway. Its maximum absolute step is 0.04
Concerning T the resolution is lower. The absolute step is 0.19 at maximum, nearly 5 times the other maximum.

Not sure exactly what you mean by “resolution”.

Here is an updated version of the ODS file. I tried to correct the values using ={Original value}+VLOOKUP(ROUND({Temperature};0);{Values of the table};1+MATCH(MROUND({Original value};5);{MS levels})), but this basically snaps the values to their closest table entries in terms of temperature and MS.

[I may be able later to answer in more detail (probably).]

For now:
I basically don’t think it’s a clever idea to interpolate along an axis given with a step width of 5 to the high resolution you are trying to achieve, if there is no explicit information about the applicable approximation/interpolation model.

Concerning the interpolation along the temperature axis my concerns are different: there is clear evidence that linear interpolation is NOT applicable. The (only) problematic range is near 35 °C, and you should try to get additional information by direct lab research about that range before you apply any model.

I would judge applying TREND() to a whole column of the table to be a bad idea.
If you need a polynomial approximation for a part of the range, the proper method should be to do it using LINEST() in a dedicated sheet/range, and to refine the table from the results instead of applying an approximation for every single final result.
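That workflow (fit once in a dedicated place, then refine the table from the fit) could be sketched like this; the degree-3 polyfit stands in for LINEST(), and the data points are hypothetical placeholders, not the real table column:

```python
# Sketch of "fit once, then refine the table": fit one correction column
# with a low-degree polynomial, then regenerate a denser column from it.
# The data points are hypothetical placeholders.
import numpy as np

t = np.array([30.0, 31.0, 32.0, 33.0, 34.0, 35.0, 36.0])  # table temperatures
c = np.array([0.55, 0.59, 0.62, 0.66, 0.71, 0.78, 0.82])  # corrections

coeffs = np.polyfit(t, c, 3)          # degree-3 least-squares fit (LINEST-like)
fine_t = np.arange(30.0, 36.01, 0.5)  # refined temperature grid
fine_c = np.polyval(coeffs, fine_t)   # regenerated, denser correction column
```

The refined column then replaces repeated per-result approximations: every final lookup only reads the denser table.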

There is already enough fake precision in the world.

Are you sure the column-ranges of the table looking linear in temperature are real (measured) and not already interpolated? Somebody might have had only sparse data for the larger range, and more detailed data only for a vicinity of 35 °C. Interpolation based on already interpolated data can be a source of seriously misleading results, imo.

Thanks for replying, here’s the updated ODS, I forgot to add it before: Sucrose vs Temperature table.ods (113.6 KB)

I basically don’t think it’s a clever idea to interpolate along an axis given with a step width of 5 to the high resolution you are trying to achieve, if there is no explicit information about the applicable approximation/interpolation model.

I understand, although the values don’t change that much between one column and the next (e.g. from 35 to 40 or from 60 to 65); many columns even share the same values.

Concerning the interpolation along the temperature axis my concerns are different: there is clear evidence that linear interpolation is NOT applicable. The (only) problematic range is near 35 °C, and you should try to get additional information by direct lab research about that range before you apply any model.

As you can see, there’s no problem now (in the updated version) with the 35 ºC row. I will try to do more research on the table, but I think (although I might be wrong) it’s based on empirical data; I’m not sure if there’s any formula behind the data. If there were a formula, that would be the best solution ever.

If you need a polynomial approximation for a part of the range, the proper method should be to do it using LINEST() in a dedicated sheet/range, and to refine the table from the results instead of applying an approximation for every single final result.

Great! I’ll go to the LO help wiki to see how to use the LINEST function :slight_smile: I think that can be a great help.

The table was extracted from an older version of this PDF (Table 5, on Annex 5, page 25).

How did the 35 °C anomaly vanish?

Now you also checked with charts, and we both can see that there must be a very smooth function of two variables describing a scalar field of “trustworthy values” to a high precision. However, you shouldn’t assume the values given for the points occurring in your coordinate system are exact. There will be deviations, or a “superimposed noise”. If you actually think corrections of 0.001 or so are relevant, you need to abandon the idea of interpolating (for which you would need to create a triangulation, imo). “The” overall smooth function, which also eliminates the noise, is what’s needed.

This also means that I’m not the expert to find the appropriate way. Basically, tasks of the kind you have occur every day, everywhere, and differential geometry, approximation theory, and the system of tools used in science and engineering generally have been developed to a high level in that respect. See https://cragl.cs.gmu.edu/toposmooth2/TopoSmooth.pdf for an example of what’s going on there.

Surely departments of mathematics and science also teach “simple” standard methods to their students, but my studies in math ended in 1968, and most of the little I once knew I have forgotten meanwhile.
Today I don’t know anybody I could ask, and a method I might invent on my own would surely not compete with what “every student” knows nowadays.

Therefore I’m out.

You may try another question or search with keywords like:
surface, scalar field, two variables (2D), smooth approximation, noise reduction, (also probably) medicine …

I see, thank you so much for all your time :slight_smile: I might rethink whether the whole issue is worth it, haha

Well, I was out, but my mind was not. That’s a bad problem with minds.

You surely know that the concept of polynomial functions is also applicable to cases with more than one variable. Therefore polynomial approximation also comes into sight for a scalar field over two variables. The “method of least squares”, and thus LINEST(), may be chosen to get such an approximation.
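A sketch of that idea: LINEST() fed with regressor columns such as T, MS, T², T·MS, MS² performs exactly this kind of two-variable least-squares fit. The Python below uses synthetic, made-up data purely to show the mechanics:

```python
# Least-squares fit of a polynomial in two variables (T and MS) via an
# explicit design matrix -- the same computation LINEST() does when given
# columns 1, T, MS, T^2, T*MS, MS^2.  All data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
T  = rng.uniform(20, 40, 200)   # temperatures
MS = rng.uniform(60, 90, 200)   # measured sucrose values
# A "true" smooth field plus small noise (entirely made up):
z = 0.01 * T + 0.002 * MS + 0.0001 * T * MS + rng.normal(0, 0.005, 200)

# Design matrix with one column per polynomial term (quadratic model):
A = np.column_stack([np.ones_like(T), T, MS, T**2, T * MS, MS**2])
coef, *_ = np.linalg.lstsq(A, z, rcond=None)

def predict(t, m):
    """Evaluate the fitted two-variable polynomial at (t, m)."""
    return coef @ np.array([1.0, t, m, t**2, t * m, m**2])
```

In Calc the same design matrix is built as helper columns next to the data, and LINEST() over those columns returns the coefficient row.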

General remarks:
Polynomials aren’t suitable for extrapolation, for obvious reasons, except in cases where a theoretical basis tells us that the “real law” is polynomial.

Concerning interpolation based on error-laden empirical data, and even beyond that case, there is a dilemma:

  1. Wanting to get small deviations for the predefined values, you need to choose a high degree. Going to the extreme, you may even force the deviations to be all 0. This gravely disregards the fact that there are errors.
  2. A polynomial of high degree can produce large local slopes and quasi-oscillations of any frequency with no justification in reality.
  3. You need to have a sound idea concerning the expectable size of the errors.
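The dilemma can be illustrated with synthetic data (a sine plus noise, nothing to do with the sucrose table): a maximal-degree polynomial drives the sample residuals to (almost) zero, while a modest degree leaves small, honest residuals and a stable curve:

```python
# Illustration of the overfitting dilemma on synthetic noisy samples.
import numpy as np

x = np.linspace(0.0, 10.0, 11)
y = np.sin(x) + np.random.default_rng(1).normal(0.0, 0.05, 11)  # noisy data

exact  = np.polyfit(x, y, 10)  # degree 10: reproduces the noise exactly
modest = np.polyfit(x, y, 3)   # degree 3: smooths over the noise

res_exact  = np.max(np.abs(np.polyval(exact, x) - y))   # ~0
res_modest = np.max(np.abs(np.polyval(modest, x) - y))  # small but honest
# The degree-10 curve hits every sample, yet between the samples it has
# no predictive value; the degree-3 curve is stable but leaves residuals.
```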

(From my experience: I never got something useful even with 1D-approx based on empirical data using higher degrees than 5.)

For the given case I assumed errors of magnitude 0.01 to be expectable and acceptable. Otherwise the lab would have published one more digit.

Experimenting with the data I found that this limit on the deviations is already (nearly) reached with degree 3. Everything else you may find by studying the attached sheets.

2Dapproximation.ods (309.4 KB)


Thank you for such a great response, and I deeply apologize for not responding sooner. I thought I had, even though I don’t have much to say other than appreciating your high-effort responses.