We address the problem of estimating the computation time necessary to approximate a function on a real computer. Our approach makes it possible to estimate the minimal time needed to compute a function up to a specified level of error. The following example illustrates the idea. Consider the space of functions defined on [0,1] whose absolute value and first derivative are bounded by C. In a certain sense, for almost every such function, approximating it on its domain using an Intel x86 computer, with an error not greater than ε, takes at least k(C, ε) seconds. Here we show how to find k(C, ε).
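One plausible way to formalize such a bound, offered here only as a sketch and not necessarily the paper's actual argument, is via the Kolmogorov ε-entropy of the function class combined with an assumed output rate R (bits per second) of the machine. The symbols R and the constant-factor form of the bound are assumptions for illustration:

```latex
% Hypothetical sketch: the class of bounded Lipschitz functions on [0,1]
\[
  \mathcal{F}_C \;=\; \bigl\{\, f:[0,1]\to\mathbb{R} \;:\; |f|\le C,\ |f'|\le C \,\bigr\},
\]
% its Kolmogorov $\varepsilon$-entropy grows on the order of $C/\varepsilon$,
\[
  H_\varepsilon(\mathcal{F}_C) \;\asymp\; \frac{C}{\varepsilon},
\]
% so a machine emitting at most $R$ bits per second would need, for almost
% every $f \in \mathcal{F}_C$, time at least of the order
\[
  k(C,\varepsilon) \;\gtrsim\; \frac{H_\varepsilon(\mathcal{F}_C)}{R}
  \;\asymp\; \frac{C}{\varepsilon R}.
\]
```

Under this reading, k(C, ε) is driven by how many bits are needed to single out an ε-approximation of a typical member of the class, divided by how fast the hardware can produce them.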
Number of pages: 5
Journal: International Journal of Foundations of Computer Science
Publication status: Published - 25 Jan 2015
- Computational complexity
- Information theory
- Performance evaluation