Diagnosing the numerical stability of a function
I have programmed a function f[x] that seems to be numerically very unstable. More precisely, I noticed that for certain arbitrary-precision arguments, the precision of the output is only half the precision of the input. This is very bad.
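
For concreteness, here is a minimal toy reproduction of the kind of loss I mean (this f is a hypothetical stand-in for my actual function, which is too long to post; it cancels nearly equal square roots):

    (* hypothetical stand-in: the subtraction cancels about 20 digits
       when x is near 10^20 *)
    f[x_] := Sqrt[x + 1] - Sqrt[x];

    x0 = SetPrecision[10^20, 50];  (* a 50-digit input *)
    Precision[x0]     (* 50. *)
    Precision[f[x0]]  (* roughly 30: about 20 digits are gone *)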


I would like to visualize the precision of f[x] relative to the precision of x, for different values of x. I am thinking of a plot of f[x] with error bars, but I am not sure how to compute and access these error bars.
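
The closest I have come is a sketch along these lines (the toy f and the working precision of 50 are assumptions carried over from above, and Around needs version 12 or later). The idea is that an arbitrary-precision result y is uncertain by roughly 10^-Accuracy[y], so that can serve as the error-bar radius:

    (* error-bar radius taken from the tracked accuracy of the result *)
    withBar[x_] := Module[{y = f[SetPrecision[x, 50]]},
       Around[y, 10^-Accuracy[y]]];

    ListPlot[Table[{k, withBar[10^k]}, {k, 15, 25}],
       AxesLabel -> {"log10 x", "f(x)"}]

    (* output precision as a function of input precision at a fixed x;
       for the toy f this makes the ~20 lost digits visible *)
    ListLinePlot[
       Table[{p, Precision[f[SetPrecision[10^20, p]]]}, {p, 25, 100, 5}],
       AxesLabel -> {"input precision", "output precision"}]

I am not sure this is the right measure, though.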


This sounds like something someone might have thought of before, perhaps in a package or in some Mathematica function that I don't know of. In general terms, my question is: what are the best ways to probe the numerical stability of a function f[x], and hopefully to find ways of improving it?
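
In case it helps focus answers, the only generic probe I have come up with myself is to compare against a much higher-precision evaluation and count the digits that agree (all names here are mine, nothing from a package; SetPrecision[..., Infinity] takes an exact rational snapshot of the computed value, so the comparison itself is exact):

    (* count digits of g[x] at precision p that agree with a
       higher-precision reference evaluation *)
    digitsCorrect[g_, x_, p_: 50] := Module[{lo, hi},
       lo = SetPrecision[g[SetPrecision[x, p]], Infinity];
       hi = SetPrecision[g[SetPrecision[x, 4 p]], Infinity];  (* reference *)
       If[lo == hi, Infinity, N[-Log10[Abs[(lo - hi)/hi]]]]];

    digitsCorrect[f, 10^20]   (* about 30 for the toy f above *)

But this feels ad hoc, so pointers to a more systematic approach would be welcome.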
