# Vulcanic Values

I’ve been playing with the idea of quantifying a value system consistent with our intuitive sense of good and evil, as well as right and wrong. The exact nature of this concept has many instantiations and is historically both philosophically and politically controversial. So we resort to some assumptions: define $u$ as the most rightful utility function, whatever it may be, and something we should seek to maximize. But in reality, in the domain of our minds (including the present blogging effort) as well as computer minds, we evaluate a different function known as a value function $v$: how much we value something in our minds. By analogy, if we were autonomous robots, this value function is something we are able to evaluate and seek to optimize as part of our self-determined program. We are generally hopeful that $v = u$, to the best of our ability. In practice, in complex decision-making systems like human society, there may be yet other functions that further approximate $v$, such as laws and rules. The function $v$ and its surrogate approximations may be implemented as a human brain, a jury, an arbitrator, or a democracy. Let’s call a surrogate $r$.

In the Star Trek films, it is revealed that an alien race called the Vulcans hold the idea that “the needs of the many outweigh the needs of the few, or the one.” In particular, it is invoked when Spock sacrifices his life for the “many” lives of his crewmates. Admittedly, it is unclear whether this applies universally or only in existential situations. (The latter is a circumstance situated in a territory of incomparable $v$’s, imho, but I digress.) There is an implicit conversion from our beliefs about needs to our beliefs about utility; let’s assume it occurs according to our intuition.

Therefore, Vulcan Logic places a constraint on the value function. One would expect it to have the form:

$v(X) = u(X) + w(X)$

Requiring

$w(A) > w(B)$ when $|A| > |B|$

$X, A, B$ are sets of objects under value evaluation (presumably comparable objects). $w()$ is a weighting function that adds weight in the mind domain on top of the true utility. One would guess this is done to compensate for our lack of comprehension of the true $u$, so that the above $v$ is actually:

$v(X) = r(X) + w(X)$

where we suggest that the mind-domain function $r$ is the best surrogate we have for $u$:

$r \approx u$

But not perfectly so, we think. To compensate for that uncertainty, we add the constrained weighting function because, according to Vulcan Logic,

$|r(X) + w(X) - u(X)| < |r(X) + w'(X) -u(X)|$

for any pair of weighting functions $w, w'$ where $w$ satisfies the greater-needs constraint and $w'$ does not.
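The additive scheme and this error claim can be illustrated with a toy numeric sketch. Everything here is invented for illustration: $u$, $r$, and both weighting functions are hypothetical stand-ins, and one numeric instance is of course an illustration, not a proof of the inequality.

```python
# Toy sketch of the additive Vulcanic scheme v(X) = r(X) + w(X).
# All quantities are hypothetical: we posit a true utility u that rewards
# group size more than the surrogate r does, so a weighting w that grows
# with |X| narrows the error |r + w - u|, while an anti-Vulcanic w' widens it.

def r(xs):
    """Hypothetical surrogate utility: sum of per-object values."""
    return float(sum(xs))

def u(xs):
    """Posited true utility: like r, plus an unmodeled bonus for group size."""
    return float(sum(xs)) + 2.0 * len(xs)

def w(xs):
    """Vulcanic weighting: strictly increasing in |X|."""
    return 2.0 * len(xs)

def w_prime(xs):
    """Anti-Vulcanic weighting: decreasing in |X|."""
    return -2.0 * len(xs)

many, few = [1.0, 1.0, 1.0], [2.0]

# The Vulcanic constraint on the weights: w(A) > w(B) when |A| > |B|.
assert w(many) > w(few)

# One instance of the error claim: |r + w - u| < |r + w' - u|.
err_vulcan = abs(r(many) + w(many) - u(many))        # = 0.0
err_other  = abs(r(many) + w_prime(many) - u(many))  # = 12.0
assert err_vulcan < err_other
print(err_vulcan, err_other)  # → 0.0 12.0
```

The whole argument rests on the posited mismatch between $r$ and $u$; a different choice of $u$ could make the anti-Vulcanic $w'$ win instead, which is exactly why the claim is a belief about $u$, not a theorem.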

Considering a closely related weighting scheme with multiplicative weights,

$v(X) = r(X) \cdot w(X)$

inspires a stranger concept of out-weighing. Suppose $r(X) = \sum_{x \in X} r(x)$, that is, the utility of the whole is merely the sum of the utilities of its constituent parts. The weighting then becomes $v(X) = \sum_{x \in X} r(x) w(x)$, and the corresponding requirement on the weights is $\sum_{x \in X} w(x) > \sum_{y \in Y} w(y)$ for all $|X| > |Y|$. The trivial version where $w = 1$ would satisfy this Vulcanic constraint. Notice also that this arrangement, as before, does not require that the most or least populous group’s utility be most optimized, only that more populous groups are weighted more heavily.
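The per-element multiplicative scheme can be sketched the same way; the per-object values below are invented, and the trivial weighting $w = 1$ recovers the plain sum while still satisfying the sum-of-weights constraint.

```python
# Sketch of the per-element multiplicative scheme v(X) = sum_x r(x) * w(x),
# with hypothetical per-object values. With the trivial weighting w(x) = 1,
# the sum of weights over X equals |X|, so the constraint
# sum_X w > sum_Y w holds strictly whenever |X| > |Y|.

def v_mult(values, weights):
    """v(X) = sum over x in X of r(x) * w(x)."""
    return sum(rv * wv for rv, wv in zip(values, weights))

X_vals = [2.0, 1.0, 1.0]  # three objects
Y_vals = [5.0]            # one object, individually more valuable

wX = [1.0] * len(X_vals)  # trivial weights
wY = [1.0] * len(Y_vals)

assert sum(wX) > sum(wY)  # sum-of-weights constraint, since |X| > |Y|
print(v_mult(X_vals, wX), v_mult(Y_vals, wY))  # → 4.0 5.0
```

Note that the smaller group still comes out ahead here (5.0 > 4.0), which illustrates the closing point: the constraint binds the weights, not the outcome.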

Cool! We have taken a few first steps toward codifying xenoethics in Earth mathematics.