Excel - how to find an exact f(x) value from a measurement graph

I am calculating the dynamic resistance of a diode. I have a lot of measurements and I've created a graph from them. The question is: how do I read an exact value of the function for a given argument off this graph? For example, I want to obtain f(x) for x=5, but I only have measurements at other exact values, e.g. x=10 -> y=213 and x=1 -> y=110. I have the graph curve, but how do I find f(5)?

This is not trivial: it will depend on your interpolation scheme and Excel does not expose the scheme it uses when drawing a graph.
Unless you tell it otherwise, Excel (I think) uses a Bezier Curve with 2 control points to perform its graphing.
This interpolation scheme transforms, via some linear algebra, to a cubic spline interpolation.
But to use cubic spline interpolation, you need more than two data points.
Since you've only given us two points, the best thing you can do is to interpolate linearly, but that will not be what Excel does.
A more detailed answer than this would only underline how broad your question is. Do Google the terms I've used: armed with a bit of time and a good internet connection, you ought to be able to solve this problem adequately.
See https://en.wikipedia.org/wiki/Spline_interpolation, https://en.wikipedia.org/wiki/B%C3%A9zier_curve
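For the record, here is a minimal sketch (in Python rather than Excel) of the two schemes mentioned above, applied to the f(5) example. Only x=1 -> 110 and x=10 -> 213 come from the question; the points at x=4 and x=7 are made up so that a cubic spline is actually defined:

# Sketch: estimate f(5) from measured (x, y) points by linear and cubic-spline interpolation.
import numpy as np
from scipy.interpolate import CubicSpline

x = np.array([1.0, 4.0, 7.0, 10.0])        # 4.0 and 7.0 are hypothetical measurements
y = np.array([110.0, 150.0, 185.0, 213.0]) # 150 and 185 are hypothetical measurements

f5_linear = np.interp(5.0, x, y)           # piecewise-linear estimate
f5_spline = float(CubicSpline(x, y)(5.0))  # smoother estimate, closer in spirit
                                           # to Excel's smoothed chart line
print(f5_linear, f5_spline)

The two estimates will generally differ, which is exactly the point above: the value you read off depends on the interpolation scheme.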

I think that you can use a pre-installed add-in named Solver. You have to activate it as shown here.
Then follow one of the tutorials you can find on the Internet (like this one), except that instead of finding a min or max you make it hit the exact value you want.
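If it helps, here is a rough Python analogue of that "fit a trend equation, then let a solver find the value you want" idea; the data points and the quadratic degree below are assumptions, purely for illustration:

# Sketch: trendline fit plus a solver-style search, in Python instead of Excel.
import numpy as np
from scipy.optimize import brentq

x = np.array([1.0, 4.0, 7.0, 10.0])        # hypothetical measurements
y = np.array([110.0, 150.0, 185.0, 213.0])

coeffs = np.polyfit(x, y, deg=2)           # trend equation (like a chart trendline)
f5 = np.polyval(coeffs, 5.0)               # value of the trend at x = 5

# What Solver/Goal Seek would do: find the x where the trend equals a target value
target = 160.0
x_at_target = brentq(lambda t: np.polyval(coeffs, t) - target, 1.0, 10.0)
print(f5, x_at_target)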

Related

Most efficient and effective way to create a surface from 3d points

Say I have a point cloud with n points in 3D space (relatively densely packed together). What is the most efficient way to create a surface that contains every single point and lets me calculate values such as the normal and curvature at some point on the surface that was created? I also need to be able to create this surface as fast as possible (a few milliseconds, hopefully, working with Python), and it can be assumed that n < 1000.
There is no "most efficient and effective" way (this is true of any problem in any domain).
In the first place, the surface you have in mind is not mathematically defined uniquely.
A possible approach is by means of the so-called alpha shapes, implemented either from a Delaunay tetrahedrization or by the ball-pivoting method. For other methods, look up "mesh reconstruction" or "surface reconstruction".
On the other hand, normals and curvature can be computed locally, from neighbor configurations, without reconstructing a surface (though there is an ambiguity in the orientation of the normals).
I would suggest Nina Amenta's Power Crust algorithm (link to code), or the MeshLab suite, which can compute the curvatures too.
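To illustrate the "compute normals and curvature locally from neighbors" route (not any of the specific tools named above), here is a minimal Python sketch using PCA on the k nearest neighbors of each point; the synthetic point cloud and the neighborhood size k are made up:

# Sketch: local normals and a curvature proxy via PCA of k-nearest-neighbor patches.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
# Synthetic stand-in for the point cloud: noisy samples of a paraboloid (n < 1000)
xy = rng.uniform(-1.0, 1.0, size=(800, 2))
z = (xy ** 2).sum(axis=1) + 0.01 * rng.standard_normal(800)
points = np.column_stack([xy, z])

tree = cKDTree(points)
k = 16                                   # neighborhood size, an assumption
_, idx = tree.query(points, k=k)

normals = np.empty_like(points)
surface_variation = np.empty(len(points))
for i, neighbours in enumerate(idx):
    cov = np.cov(points[neighbours].T)   # 3x3 covariance of the local neighborhood
    eigvals, eigvecs = np.linalg.eigh(cov)
    normals[i] = eigvecs[:, 0]           # smallest-eigenvalue eigenvector = unoriented normal
    surface_variation[i] = eigvals[0] / eigvals.sum()  # rough curvature-like proxy

The eigenvector of the smallest eigenvalue gives the (unoriented) normal, and the ratio of that eigenvalue to the total is a common curvature-like quantity, without building any mesh.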

Extrapolate bell shape from a section of curve

I am interested in extrapolating the full curve for a population that I know is normally distributed. However, in my process, I am only able to get access to a section of the curve (from -3 standard deviations to -2 standard deviations). My question is: what is the best way to fit a curve to a section of a bell curve?
A normal distribution can be defined by an equation f(x) (its PDF, f(x) = 1/(σ√(2π)) · exp(-(x - μ)²/(2σ²)); see the Wikipedia page) with two parameters: the mean μ and the variance σ² (or standard deviation σ).
Therefore, if you want to know which variance and mean define it, you just need to solve for the mean and the variance given two known values (which you have infinitely many of, even on a short interval).
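As a sketch of that "two known values pin down the two parameters" argument, here is a small Python example; the two (x, f(x)) pairs are hypothetical values read off the observed section of the curve:

# Sketch: recover mean and standard deviation of a normal PDF from two known (x, pdf) pairs.
from scipy.optimize import fsolve
from scipy.stats import norm

samples = [(-2.5, 0.0175), (-2.2, 0.0355)]   # hypothetical (x, f(x)) pairs from the observed section

def equations(params):
    mu, sigma = params
    return [norm.pdf(x, mu, sigma) - p for x, p in samples]

mu, sigma = fsolve(equations, x0=[0.0, 1.0])  # initial guess is an assumption
print(mu, sigma)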

How to optimize two ranges for the determination of the intersection point between two curves

I start this thread asking for your help in Excel.
The main goal is to determine the coordinates of the intersection point P=(x,y) between two curves (curve A, curve B) modeled by points.
The curves are non-linear, and each defining point is computed from complex equations (the equations depend on many parameters chosen by the user, and the user also chooses the number of points, which sets the accuracy of the curves). That is to say, each curve (curve A and curve B) changes in the XY plane (the Z coordinate is always zero; we are working in the XY plane) according to the input parameters, and the number of defining points also depends on the user's choice.
My first attempt was to determine the intersection point through the trend equations of each curve (I used the LINEST function to determine the coefficients of the polynomial equations), putting them into a system and solving it. The problem is that Excel does not interpolate the curves very well because they span too wide a range, so the intersection point (the solution of the system) ends up very far from the real solution.
So what I want to do is shorten the ranges of points used to fit the two trend equations, cutting away the portions of the curves where the intersection cannot exist.
Today, in order to find the solution, I plot the curves in Siemens NX CAD using multi-segment splines of order 3, and then I can easily find the coordinates of the intersection point. Please note that I use multi-segment splines to approximate curves A and B more precisely.
Since I want to avoid the CAD tool and stay entirely in Excel, is there a way to select a shorter range of defining points close to the intersection point in order to better approximate curve A and curve B with trend equations (the LINEST function with 4 points and a 3rd-order fit) and then find the solution?
I attach a picture to give you an example of Curve A and Curve B on the plane:
https://postimg.cc/MfnKYqtk
At the following link you can find the Excel file with the coordinate points and the curve plot:
https://www.mediafire.com/file/jqph8jrnin0i7g1/intersection.xlsx/file
I hope to solve this problem with your help, thank you in advance!
kalo86
Your question gave me some days of thinking and research.
With the help of https://pomax.github.io/bezierinfo/ (§ 27 - Intersections, line-line intersections, and § 28 - Curve/curve intersection), your problem can be solved in Excel.
You can find details about the mystery of Excel's smoothed lines here:
https://blog.splitwise.com/2012/01/31/mystery-solved-the-secret-of-excel-curved-line-interpolation/
The author of this fit is Dr. Brian T. Murphy, PhD, PE, of www.xlrotor.com. You can find details here:
https://www.xlrotor.com/index.php/our-company/about-dr-murphy
https://www.xlrotor.com/index.php/knowledge-center/files
See Smooth_curve_bezier_example_file.xls, available at:
https://www.xlrotor.com/smooth_curve_bezier_example_file.zip
Knitting these together, you get the following results for the intersection of your given curves:
for the straight line intersection:
(x = -1,02914127711195 / y = 23,2340949174492)
for the smooth line intersection:
(x = -1,02947493047196 / y = 23,2370611219553)
For full automation of your task you would need to add more details regarding the required accuracy and what you need for further processing (and that is actually beyond the scope of this site ;-).
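For completeness, the same curve/curve intersection idea can be sketched outside Excel: interpolate each point-defined curve and root-find their difference. The following Python snippet does this with cubic splines; the point data and the bracketing interval are made up for illustration, not taken from your workbook:

# Sketch: intersection of two point-defined curves via spline interpolation + root finding.
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import brentq

# Curve A and curve B, each given only as (x, y) points (synthetic data here)
xa = np.linspace(-5.0, 5.0, 11)
ya = 0.5 * xa**2 + 10.0
xb = np.linspace(-5.0, 5.0, 11)
yb = -2.0 * xb + 21.0

curve_a = CubicSpline(xa, ya)
curve_b = CubicSpline(xb, yb)

# Bracket [0, 5] is chosen by inspecting where the plotted curves cross
x_int = brentq(lambda x: curve_a(x) - curve_b(x), 0.0, 5.0)
y_int = float(curve_a(x_int))
print(x_int, y_int)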
(Charts omitted: intersection of the straight lines, intersection of the smoothed lines, comparison charts, and the solution.)
Thank you very much for the answer; you hit my goal perfectly.
Your solution (for the smoothed lines) is very very close to what I determine in Siemens NX.
I'm going to read the documentation at the provided link https://pomax.github.io/bezierinfo/ in order to better understand the math behind this argument.
So, to summarize my request: you were able to find the (x, y) coordinates of the intersection point between two curves with very good precision, without going through an advanced CAD system.
I am starting to study now, best regards!
kalo86

Why is string interpolation named the way it is?

The term interpolation is usually used with mathematical functions, when determining a value of a function between given values, which makes perfect sense. I don't see how that applies to strings: what is being interpolated? Am I missing something obvious?
Interpolation in mathematics is simply working out the things between two points(a). For example, cubic spline fitting over a series of points will give you a curve of some description (I consider a straight line to be a degenerate curve here so don't bother pointing out that some formulae generate such a beast) between each set of points, even though you have no actual data there.
Contrast this with extrapolation which will give you data beyond the endpoints. An example of that is seeing that, based on history, the stock market indices rise at x percent per annum so, in a hundred years, will be much higher than they are now.
So it's a short step to the most likely explanation as to why variable substitution within strings is called interpolation, since you're changing things within the bounds of the data:
xyzzy="42"
plugh="abc${xyzzy}xyz"
# now plugh is equal to "abc42xyz"
(a) The actual roots of the word are Latin inter + polare, which translate to "within" and "polish" (in the sense of modify or improve). See here for more detail.

Interpolation technique for weirdly spaced point data

I have a spatial dataset that consists of a large number of point measurements (n = 10^4) that were taken along regular grid lines (500 m x 500 m) and some arbitrary lines and blocks in between. Single measurements were taken with a spacing of about 0.3-1.0 m (varying) along these lines (see the example showing every 10th point).
The data can be assumed to be normally distributed but shows a strong small-scale variability in some regions. And there is some trend with elevation (r=0.5) that can easily be removed.
Regardless of the coding platform, I'm looking for a good or "the optimal" way to interpolate these points to a regular 25 x 25m grid over the entire area of interest (5000 x 7000m). I know about the wide range of kriging techniques but I wondered if somebody has a specific idea on how to handle the "oversampling along lines" with rather large gaps between the lines.
Thank you for any advice!
Leo
The kriging technique does not perform well when the points to interpolate are taken on a regular grid, because a wide range of different inter-point distances is needed to estimate the covariance model well.
Your case is a bit particular... The oversampling along the lines is not a problem at all. The main problem is the big holes you have in your grid. I think that these holes will create problems whatever interpolation technique you use.
However, it is difficult to predict a priori whether kriging will behave well. I advise you to try it anyway.
Kriging is only suited for interpolating. You cannot extrapolate with a kriging metamodel, so you won't be able to predict values in the bottom-left part of your figure, for example (because you have no points there).
To perform kriging, I advise you to use the following tools (depending on the language you're more familiar with):
DiceKriging package in R (the one I use preferably)
fields package in R (which is more specialized on spatial fields)
DACE toolbox in matlab
Bonus: a link to a reference book about kriging which is available online: http://www.gaussianprocess.org/
PS: This type of question is more statistics oriented than programming and may be better suited to the stats.stackexchange.com website.
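As a Python alternative to the R/Matlab tools above, here is a hedged sketch of the same kriging / Gaussian-process idea with scikit-learn, predicting line-sampled measurements onto a regular grid. The synthetic data, kernel, and grid spacing are assumptions; with n ~ 10^4 points you would likely subsample or use a local/approximate variant:

# Sketch: Gaussian-process regression ("kriging") from line-sampled points onto a regular grid.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
# Synthetic measurements clustered along a few "survey lines" (stand-in for the real data)
xy = np.vstack([np.column_stack([np.full(200, x0), np.linspace(0, 7000, 200)])
                for x0 in (0.0, 500.0, 1000.0, 1500.0)])
z = np.sin(xy[:, 0] / 800.0) + np.cos(xy[:, 1] / 1500.0) + 0.1 * rng.standard_normal(len(xy))

kernel = 1.0 * RBF(length_scale=500.0) + WhiteKernel(noise_level=0.01)  # assumed kernel
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(xy, z)

# Regular prediction grid (coarsened here to keep the example small; target would be 25 m)
gx, gy = np.meshgrid(np.arange(0, 2000, 100), np.arange(0, 7000, 250))
grid = np.column_stack([gx.ravel(), gy.ravel()])
z_pred, z_std = gp.predict(grid, return_std=True)  # z_std is large in the big gaps

The predictive standard deviation is a convenient way to see where the big holes between lines make the interpolation unreliable, which is the concern raised above.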
