The technique is clever: they take a small finite step along the imaginary axis, and this turns out to give a more accurate derivative than normal forward differences.
We don't support this in Ceres and suggest autodiff instead.
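For readers who haven't seen it, here's a minimal sketch of that complex-step trick (the function f and the step size h are just made-up examples for illustration, not anything from Ceres):

    #include <complex>
    #include <cstdio>

    // Any smooth function written against std::complex<double> works here;
    // f(x) = exp(x) / sqrt(x) is just a stand-in.
    std::complex<double> f(std::complex<double> x) {
      return std::exp(x) / std::sqrt(x);
    }

    int main() {
      const double x = 1.5;
      const double h = 1e-20;  // tiny step; nothing is ever subtracted
      // Complex step: f'(x) ~= Im(f(x + i*h)) / h, with O(h^2) truncation error
      // and none of the cancellation that kills (f(x + h) - f(x)) / h.
      const double deriv = std::imag(f({x, h})) / h;
      std::printf("f'(%.2f) ~= %.15f\n", x, deriv);
      return 0;
    }

Because there is no subtraction, h can be made absurdly small and the truncation error effectively vanishes; the catch is that the entire function has to be analytic and written in complex arithmetic, which is part of why autodiff ends up being the more practical option.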
I apologize for replying without actually looking at your posted code first.
What I meant to say was, "If it's all in one C++ project, it's not symbolic differentiation" and I took the liberty of changing "not symbolic" to "numeric."
Having now read your posted code, I kind of feel like a 17th-century naturalist trying to classify a platypus. Clearly this technique is not symbolic differentiation, since it can only produce numerical and not symbolic results (although I suppose the Jet template could handle an "expression" type as its first argument - have you actually tried this?), and it's not really numerical in the usual sense either, because there's no step size h whose value changes the accuracy of the result.
As an aside, "automatic differentiation" is a terrible name, in the sense that it doesn't convey any real information about the technique, except maybe that it's being done by a computer, and I knew that already. It might still be a good marketing term (like how Richard Bellman named his technique "Dynamic Programming" because it's impossible to call something "dynamic" in a pejorative sense, although there really isn't anything intrinsically dynamic about it, and in fact you commonly end up solving "static dynamic programming" problems).
Automatic differentiation is "symbolic" in the sense that the resulting derivatives are exact, in the same way that you get exact derivatives if you differentiate by hand and implement the resulting expression. There is no approximation.
Numeric differentiation has a specific meaning: computing derivatives via finite differencing.
There are three ways to compute derivatives; for whatever reason most people only know about symbolic (take the derivative by hand, implement it in C++) and numeric (implement your cost function, do finite differences on each parameter to determine the gradient) differentiation. Automatic differentiation is a totally different way to take derivatives. The Wikipedia article about it is fairly good.
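To make that concrete, here is a bare-bones sketch of forward-mode autodiff with dual numbers. This is not Ceres's actual Jet from jet.h, just the same idea stripped down, and the function f(x) = x*x + sin(x) is an arbitrary example:

    #include <cmath>
    #include <cstdio>

    // A dual number carries a value and its derivative together.
    // Ceres's Jet<T, N> in jet.h is the same idea, generalized to N partials.
    struct Dual {
      double v;  // value
      double d;  // derivative with respect to the chosen input
    };

    Dual operator+(Dual a, Dual b) { return {a.v + b.v, a.d + b.d}; }
    Dual operator*(Dual a, Dual b) { return {a.v * b.v, a.d * b.v + a.v * b.d}; }
    Dual sin(Dual a) { return {std::sin(a.v), std::cos(a.v) * a.d}; }

    // Write the function once as a template; evaluating it with Dual instead of
    // double produces the derivative. Ceres cost functors are templated on T
    // for exactly this reason.
    template <typename T>
    T f(T x) { return x * x + sin(x); }

    int main() {
      Dual x{2.0, 1.0};  // seed dx/dx = 1; the chain rule does the rest
      Dual y = f(x);
      // Exact up to floating-point roundoff: f'(2) = 2*2 + cos(2).
      std::printf("f(2) = %f, f'(2) = %f (by hand: %f)\n",
                  y.v, y.d, 2.0 * 2.0 + std::cos(2.0));
      return 0;
    }

Note there is no h anywhere: each overloaded operator applies the exact chain rule, which is why the result is exact rather than an approximation.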
Interesting; this is the first I've heard of automatic differentiation and it's quite elegant and exact. Do you know why it's not covered more in academia?
I've wondered the same myself. It's rather popular in industry. I hadn't heard of it until coming to Google. Fun fact: backpropagation for neural networks, which is really just a fancy word for computing the derivatives of the cost, is equivalent to reverse mode autodiff.
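To illustrate that equivalence with a toy (this has nothing to do with Ceres internals, which use forward-mode Jets; the Tape class and the function f(x, y) = x*y + sin(x) are made up for the example): reverse mode records the local partial derivatives of every operation on a tape, then one backward sweep accumulates d(output)/d(everything) - which is exactly what backprop does across a network's layers.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // A tiny tape for reverse-mode autodiff: every operation records which
    // nodes it read and its local partial derivatives with respect to them.
    struct Tape {
      struct Node { int a, b; double da, db; };
      std::vector<Node> nodes;
      std::vector<double> value;

      int input(double v) {
        nodes.push_back({-1, -1, 0.0, 0.0});
        value.push_back(v);
        return (int)value.size() - 1;
      }
      int add(int x, int y) {
        nodes.push_back({x, y, 1.0, 1.0});
        value.push_back(value[x] + value[y]);
        return (int)value.size() - 1;
      }
      int mul(int x, int y) {
        nodes.push_back({x, y, value[y], value[x]});
        value.push_back(value[x] * value[y]);
        return (int)value.size() - 1;
      }
      int sine(int x) {
        nodes.push_back({x, -1, std::cos(value[x]), 0.0});
        value.push_back(std::sin(value[x]));
        return (int)value.size() - 1;
      }
      // One backward sweep yields d(output)/d(every node): this is backprop.
      std::vector<double> grad(int out) const {
        std::vector<double> adj(value.size(), 0.0);
        adj[out] = 1.0;
        for (int i = out; i >= 0; --i) {
          if (nodes[i].a >= 0) adj[nodes[i].a] += adj[i] * nodes[i].da;
          if (nodes[i].b >= 0) adj[nodes[i].b] += adj[i] * nodes[i].db;
        }
        return adj;
      }
    };

    int main() {
      // f(x, y) = x*y + sin(x); the whole gradient comes from one reverse pass.
      Tape t;
      int x = t.input(2.0), y = t.input(3.0);
      int xy = t.mul(x, y);
      int sx = t.sine(x);
      int f = t.add(xy, sx);
      std::vector<double> g = t.grad(f);
      std::printf("df/dx = %f (by hand: %f), df/dy = %f (by hand: %f)\n",
                  g[x], 3.0 + std::cos(2.0), g[y], 2.0);
      return 0;
    }

Forward mode (the Jet approach) gives one directional derivative per pass, which suits small parameter blocks; reverse mode gives the full gradient of a scalar output in one pass, which is what training a neural network needs.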
For the people wondering what autodiff is, I found the following article a real eye-opener when I came across it (in particular, the paper he links to is very accessible: http://homepage.mac.com/sigfpe/paper.pdf).
While we do support numeric differentiation, we don't suggest you use it. What we have is automatic differentiation, which is a technique to take exact derivatives without resorting to symbolic differentiation.
Check out the jet.h file which is the core implementation of this:
http://code.google.com/p/ceres-solver/source/browse/include/...
The header comment has a nice description of autodiff.
In summary, Ceres supports three ways to compute derivatives (a rough sketch of each follows the list):
(1) Automatic differentiation (easiest, fastest, most accurate)
(2) Numeric differentiation (easy, but gives worse convergence and is numerically hazardous)
(3) User-supplied jacobian (use pen and paper, then implement the jacobian for your residual manually)
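Roughly what those three look like in code, for a toy residual r(x) = 10 - x (this follows the pattern in the Ceres documentation, but treat the exact template arguments as illustrative; they may differ between versions):

    #include "ceres/ceres.h"

    // (1) Autodiff: a functor templated on T, so it can be evaluated with Jets.
    struct AutoDiffResidual {
      template <typename T>
      bool operator()(const T* const x, T* residual) const {
        residual[0] = T(10.0) - x[0];
        return true;
      }
    };

    // (2) Numeric diff: a plain-double functor that Ceres finite-differences.
    struct NumericDiffResidual {
      bool operator()(const double* const x, double* residual) const {
        residual[0] = 10.0 - x[0];
        return true;
      }
    };

    // (3) Analytic: derive dr/dx = -1 on paper and implement it yourself.
    class AnalyticResidual : public ceres::SizedCostFunction<1, 1> {
     public:
      bool Evaluate(double const* const* parameters, double* residuals,
                    double** jacobians) const override {
        residuals[0] = 10.0 - parameters[0][0];
        if (jacobians != NULL && jacobians[0] != NULL) {
          jacobians[0][0] = -1.0;
        }
        return true;
      }
    };

    int main() {
      double x = 0.5;
      ceres::Problem problem;
      // Option (1); options (2) and (3) would be added the same way with
      //   new ceres::NumericDiffCostFunction<NumericDiffResidual,
      //                                      ceres::CENTRAL, 1, 1>(
      //       new NumericDiffResidual)
      // or simply  new AnalyticResidual.
      problem.AddResidualBlock(
          new ceres::AutoDiffCostFunction<AutoDiffResidual, 1, 1>(
              new AutoDiffResidual),
          NULL, &x);
      ceres::Solver::Options options;
      ceres::Solver::Summary summary;
      ceres::Solve(options, &problem, &summary);
      return 0;
    }

On the user's side the only difference is which wrapper you hand to AddResidualBlock; the autodiff version is barely more work to write than the numeric one, which is the point of the summary above.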