
Sure, but it's bounded by human creativity, and in any case there's a difference between reliably and deterministically computing the function and a sort of approximation by repeated guessing, or reaching a certain confidence.

I only have vague recollections of numerical analysis; I remembered the Newton-Raphson method (no good: we don't have an oracle for f'(x), just f(x)) and stumbled onto the Runge-Kutta Wikipedia page, which I'd forgotten about (also no good, that's for solving ODEs).

But it seems like a good strategy would be to test one number, guess assuming constant, test a second, guess assuming linear, test a third, guess assuming quadratic, and so on.

In the presence of step changes or the like, though, I suppose there's nothing you can do.
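
Roughly the following, as a sketch (Python; numpy's polyfit/polyval stand in for doing the interpolation by hand, and `mystery` is just a made-up placeholder for the hidden function, not anything specific):

```python
import numpy as np

def mystery(x):
    """Hypothetical black box: we may only evaluate it, never inspect it."""
    return 3.0 * x**2 - 2.0 * x + 1.0

xs, ys = [], []
for n in range(6):
    xs.append(float(n))
    ys.append(mystery(float(n)))
    # With n+1 samples we can fit a degree-n polynomial exactly.
    coeffs = np.polyfit(xs, ys, deg=n)
    predicted = np.polyval(coeffs, n + 1.0)   # guess the next value
    actual = mystery(n + 1.0)
    print(f"degree {n}: predicted {predicted:.3f}, actual {actual:.3f}")
    if abs(predicted - actual) < 1e-9:        # guess confirmed: stop raising the degree
        print(f"a degree-{n} polynomial explains every sample so far")
        break
```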



> But it seems like a good strategy would be to test one number, guess assuming constant, test a second, guess assuming linear, test a third, guess assuming quadratic, and so on.

You are still implicitly assuming that the function is continuous. There are plenty of discontinuous functions, e.g. the sawtooth function; or, more interestingly, Conway's base 13 function, which is nowhere continuous and takes on every real value in every interval.
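
To make the continuity point concrete, here's a throwaway illustration (mine, not anything from upthread): exactly interpolate a few samples of a sawtooth taken inside one tooth, and the fitted polynomial looks fine right up until the first jump.

```python
import numpy as np

def sawtooth(x):
    return x - np.floor(x)   # jumps at every integer

xs = np.array([0.1, 0.3, 0.5, 0.7])           # all samples inside one tooth
ys = sawtooth(xs)
coeffs = np.polyfit(xs, ys, deg=len(xs) - 1)  # exact interpolant (here just y = x)

print(np.polyval(coeffs, 0.9), sawtooth(0.9)) # ~0.9 vs 0.9  -- fine before the jump
print(np.polyval(coeffs, 1.5), sawtooth(1.5)) # ~1.5 vs 0.5  -- wrong after the jump
```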

Also, what about continuous functions that aren't polynomials, e.g. exponentials, logarithms, sine/cosine, etc.?
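
On that point, a quick (equally home-made) check: run the same raise-the-degree loop against exp and the guess at the next sample keeps missing, because no finite-degree polynomial coincides with exp, so the scheme never gets to declare victory.

```python
import numpy as np

xs, ys = [], []
for n in range(8):
    x = n / 4.0                                   # sample points 0, 0.25, 0.5, ...
    xs.append(x)
    ys.append(np.exp(x))
    coeffs = np.polyfit(xs, ys, deg=n)            # exact fit of the samples so far
    x_next = (n + 1) / 4.0
    err = abs(np.polyval(coeffs, x_next) - np.exp(x_next))
    print(f"degree {n}: error at the next sample = {err:.2e}")
# The error shrinks (exp is smooth, so interpolation converges on the interval),
# but it never reaches zero, so a "stop when the guess is confirmed" rule never fires.
```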



