
I'm just starting a foray into geometric algebra and calculus so that I can develop a geometric version of the standard arithmetic neural net. When calculating the error function for a standard neural net, arithmetic spaces have a wonderful convenience: squaring coerces negative and positive deviations into the same non-negative space. This means that, with x and y as vectors of predictions and targets, the following can be a totally reasonable error function:

sum((xi - yi) ** 2 for xi, yi in zip(x, y))

However, when I try to move this to the geometric space, the obvious equivalent is something like:

math.prod(xi / yi for xi, yi in zip(x, y))

However, the clear problem there is that errors on opposite sides of the spectrum (ratios above and below 1) counteract each other instead of accumulating. You can write a function that accounts for this by always taking the ratio that is at least 1:

math.prod(xi / yi if xi > yi else yi / xi for xi, yi in zip(x, y))

which is more or less the multiplicative analogue of:

sum(xi - yi if xi > yi else yi - xi for xi, yi in zip(x, y))

but for the life of me I just can't seem to think of what the equivalent of squaring is in a geometric space. I apologize in advance if there's an obvious answer to this, but I've been at a loss for a while now.
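
For concreteness, here's a minimal sketch of the cancellation problem and the conditional fix (plain Python; the example vectors are made up purely for illustration, and math.prod needs Python 3.8+):

import math

# Hypothetical target and prediction vectors, chosen so the two errors
# point in opposite directions (one 2x overshoot, one 2x undershoot).
y = [1.0, 2.0, 4.0]   # targets
x = [2.0, 1.0, 4.0]   # predictions

# Additive space: squaring folds over- and under-shoot into the same space.
print(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))                         # 2.0

# Naive geometric version: the 2x and 0.5x ratios cancel to exactly 1.
print(math.prod(xi / yi for xi, yi in zip(x, y)))                          # 1.0

# Conditional fix from above: always take the ratio that is >= 1.
print(math.prod(xi / yi if xi > yi else yi / xi for xi, yi in zip(x, y)))  # 4.0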


1 Answer


The operation after exponentiation in this case is tetration. Apparently it's part of a class of operations known as hyper-operations, which proceed as one would expect, moving from addition to multiplication, to exponentiation, and so on.

There also seems to be a general nomenclature for these operations that extends past tetration: pentation, hexation, etc.
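
For concreteness, here's a minimal sketch of the hyper-operation ladder in Python (the function name hyper and the rank convention are my own for this sketch, not a library API; b must be a positive integer for ranks 4 and up):

def hyper(n, a, b):
    # Hyper-operation of rank n:
    #   n=1: a + b,  n=2: a * b,  n=3: a ** b,  n=4: tetration,  n=5: pentation, ...
    if n == 1:
        return a + b
    if n == 2:
        return a * b
    if n == 3:
        return a ** b
    # n >= 4: right-fold the rank-(n-1) operation over b copies of a.
    result = a
    for _ in range(b - 1):
        result = hyper(n - 1, a, result)
    return result

print(hyper(3, 2, 4))   # 16     -- exponentiation: 2 ** 4
print(hyper(4, 2, 3))   # 16     -- tetration: 2 ** (2 ** 2)
print(hyper(4, 2, 4))   # 65536  -- tetration: 2 ** (2 ** (2 ** 2))
print(hyper(5, 2, 3))   # 65536  -- pentation: tetration folded twice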

It appears that tetration doesn't actually have the same qualities that squaring has in arithmetic, which is strange to me, but this seems to be the right vein at least.
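
A quick sketch of the mismatch: squaring sends a deviation d and its negation -d to the same value, but height-2 tetration (r ** r) sends a ratio r and its reciprocal 1/r to different values, so it can't fold opposite-side ratios into one space the way squaring folds opposite-side differences:

# Squaring identifies a deviation with its negation...
d = 3.0
print(d ** 2, (-d) ** 2)             # 9.0 9.0

# ...but r ** r does not identify a ratio with its reciprocal.
r = 2.0
print(r ** r, (1 / r) ** (1 / r))    # 4.0 0.7071067811865476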

