Abstract (eng): We address the problem of estimating the computation time needed to approximate a function on a real computer. Our approach makes it possible to estimate the minimal time required to compute a function up to a specified error. The following example illustrates the idea. Consider the space of functions defined on [0,1] whose absolute value and first derivative are bounded by C. In a certain sense, for almost every such function, approximating it on its domain with an Intel x86 computer, to an error of at most epsilon, takes at least k(C,epsilon) seconds. We show how to find k(C,epsilon).
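A brief formalization of the claim above, given only as a sketch; the notation W_C for the function class and T_x86(f, epsilon) for the approximation time are illustrative assumptions and do not appear in the paper:

\[
  W_C \;=\; \bigl\{\, f : [0,1] \to \mathbb{R} \;:\; |f(x)| \le C,\ |f'(x)| \le C \ \text{for all } x \in [0,1] \,\bigr\},
\]
\[
  T_{\mathrm{x86}}(f,\varepsilon) \;\ge\; k(C,\varepsilon) \quad \text{for almost every } f \in W_C,
\]

where $T_{\mathrm{x86}}(f,\varepsilon)$ denotes the time, in seconds, that an Intel x86 computer needs to approximate $f$ on $[0,1]$ with error at most $\varepsilon$.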