(No idea if you'll ever see this, I also have no damned idea what I'm talking about.)
Looking at the Weierstrass function, it's a convergent series (uniformly convergent, in fact) whose term-by-term gradient series doesn't converge.
(The standard difference-quotient definition of the derivative doesn't apply, because the assumption that the function is approximately linear at the scale of 𝛿x fails, due to the scale-invariant properties of the fractal.)
In that case the integrated function should be even better behaved (its series converges faster), since the 'noise' (pretty much) integrates to zero.
So can you differentiate via integration, and get a gradient for a related function (with less noise)?
I messed around with this, taking the areas of a pair of little triangle approximations and adding them to get a ~rectangle.
Then divide that area by 𝛿x to get 𝛿y, and divide that by 𝛿x to get 𝛿y/𝛿x.
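(Roughly: each little triangle has area ≈ ½·𝛿y·𝛿x, so the pair add up to a ~rectangle of area ≈ 𝛿y·𝛿x, which is what dividing by 𝛿x twice recovers the gradient from.)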
This was the form that I reached:
let y = f(x)
let F(x) = ∫f(x) dx
𝛿y/𝛿x = (F(x + 𝛿x) + F(x - 𝛿x) - 2 * F(x))/((𝛿x) ^ 2)
At the very least, it passed my polynomial sanity check.
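Something along these lines (a minimal sketch of the idea, not the exact script; the cubic test function, the hand-written antiderivative, and the names are just illustrative):

  def f(x):
      return x**3              # test function

  def F(x):
      return x**4 / 4          # an antiderivative of f (exact, just for the check)

  def grad_via_integral(F, x, dx=1e-4):
      # (F(x + dx) + F(x - dx) - 2*F(x)) / dx^2 -- the second symmetric difference of F
      return (F(x + dx) + F(x - dx) - 2 * F(x)) / dx**2

  for x in (0.5, 1.0, 2.0):
      print(x, grad_via_integral(F, x), 3 * x**2)   # approximation vs exact f'(x) = 3x^2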
That's awesome. Glad to see your method worked. If you ever run into a serious mathematician, they can probably give you the proper name for it. Or perhaps we'll have to name it the Yarg integral :-)
Ah yes, this is why epsilon squared equals 0 in the dual numbers -- to automatically remove all higher-order terms when they appear. The exponent on epsilon is indicative of the rank/order of the derivative it carries.
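(For illustration, a minimal dual-number sketch -- just enough arithmetic to show the ε² = 0 truncation in action:)

  class Dual:
      # a + b*eps, with eps**2 defined to be 0
      def __init__(self, real, eps=0.0):
          self.real, self.eps = real, eps
      def __add__(self, other):
          return Dual(self.real + other.real, self.eps + other.eps)
      def __mul__(self, other):
          # (a1 + b1*eps)(a2 + b2*eps) = a1*a2 + (a1*b2 + b1*a2)*eps + b1*b2*eps^2,
          # and the eps^2 term is simply dropped because eps^2 = 0
          return Dual(self.real * other.real,
                      self.real * other.eps + self.eps * other.real)
      def __repr__(self):
          return f"{self.real} + {self.eps}eps"

  x = Dual(2.0, 1.0)      # seed the eps part with 1 to track d/dx
  print(x * x * x)        # 8.0 + 12.0eps -- the eps coefficient is f'(2) for f(x) = x^3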
> The limit is called the second symmetric derivative. Note that the second symmetric derivative may exist even when the (usual) second derivative does not.
> This limit can be viewed as a continuous version of the second difference for sequences.
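(For reference, that limit is lim[h→0] (f(x + h) + f(x - h) - 2·f(x)) / h² -- exactly the form I wrote above, with F in place of f.)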
So it actually can do what I intended it to do, nothing original - but there was a degree of satisfaction in deriving it myself.