I'm just starting a foray into geometric algebra and calculus so that I can develop a geometric (multiplicative) version of the standard arithmetic neural net. When calculating the error function for a standard neural net, arithmetic spaces have a wonderful convenience: squaring coerces negative and positive deviations into the same nonnegative space, so errors accumulate instead of canceling. This means the following can be a totally reasonable error function:
sum((x-y)**2)
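To make that concrete, here is a minimal sketch of that additive squared-error loss, assuming `xs` and `ys` are paired sequences of predictions and targets (the function name and arguments are mine, just for illustration):

```python
def squared_error(xs, ys):
    # Squaring maps each residual to a nonnegative value,
    # so positive and negative errors accumulate rather than cancel.
    return sum((x - y) ** 2 for x, y in zip(xs, ys))

print(squared_error([1.0, 2.0], [2.0, 0.0]))  # (1-2)**2 + (2-0)**2 = 5.0
```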
However, when I try to move this to the geometric space, the obvious equivalent is something like:
product(x/y)
However, the clear problem there is that errors on opposite sides of the spectrum (ratios above and below 1) counteract each other instead of accumulating. You can write a function that accounts for this by doing something like:
product(x/y if x > y else y/x)
which is more or less equivalent to:
sum(x-y if x > y else y-x)
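A quick sketch of that multiplicative version, with its additive counterpart recovered through logs (since the log of a product of ratios is a sum of log-differences). Function names and test values are my own, for illustration only:

```python
import math

def multiplicative_error(xs, ys):
    # Flip each ratio so it is always >= 1; errors then
    # accumulate multiplicatively instead of canceling.
    prod = 1.0
    for x, y in zip(xs, ys):
        prod *= (x / y) if x > y else (y / x)
    return prod

def additive_counterpart(xs, ys):
    # Log of the product above: sum of |log x - log y|, which
    # mirrors sum(x - y if x > y else y - x), but in log space.
    return sum(abs(math.log(x) - math.log(y)) for x, y in zip(xs, ys))

xs, ys = [2.0, 0.5], [1.0, 1.0]
print(multiplicative_error(xs, ys))            # 2 * 2 = 4.0
print(math.exp(additive_counterpart(xs, ys)))  # same value: 4.0
```

Note the conditional flip plays the same role as the absolute value: it forces every per-element error to the same side (>= 1 multiplicatively, >= 0 additively).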
but for the life of me I can't think of what the equivalent of squaring is in a geometric space. I apologize in advance if there's an obvious answer to this; I've been at a loss for a while now.