Equality of Benefit

I’ve been involved in a lot of discussion around bias, equality and fairness in algorithmic decision making. Without going into an excessive amount of background and detail, the gist of my belief at the moment is that equality of utility is the safest thing for companies to aspire to.

What is equality of utility? Let’s restrict to binary decision making: given an individual x, who has observable features f(x) and a protected feature p(x), suppose the company has to choose between two actions {a, b}. What is a workable definition of fairness or equality in such a decision with respect to the protected property p?

Let god, a neutral third party, bestow on us a utility function u whose evaluation u(x) on an individual yields a function over actions: u(x)(a) is the utility to individual x of the company taking action a, and u(x)(b) is the utility to individual x of the company taking action b.

Let g be the company’s decision process: g(·) is the decision the company makes, either a or b, for the situation. Then the right thing to do is:

g(f(x)) = argmax_{i ∈ {a, b}} u(x)(i) = g(f(x), p(x))

Simple: we do as god says is best for the customer, acting as if we had the knowledge of an oracle, even when we know of some reason for discrimination. The decision should depend only on what maximizes the individual’s utility, so adding the protected feature p(x) as an input changes nothing.
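A minimal sketch of this criterion, in Python. The utility function `u` here is a made-up stand-in for the oracle's judgment (in reality it is unknown and would have to be estimated), and the feature names are hypothetical; the point is only that the decision rule `g` is an argmax over the individual's utilities and never consults the protected feature:

```python
def u(x):
    """Hypothetical oracle utility: maps an individual to
    per-action utilities, u(x)(a) and u(x)(b)."""
    return {"a": x["credit_score"] / 850, "b": 0.5}

def g(x):
    """Company decision rule: take the action with the highest
    utility for the individual. The protected feature x["p"]
    is present but deliberately never used."""
    utilities = u(x)
    return max(utilities, key=utilities.get)

# Two individuals identical in observable features f(x),
# differing only in the protected feature p(x):
x1 = {"credit_score": 700, "p": "group_1"}
x2 = {"credit_score": 700, "p": "group_2"}

assert g(x1) == g(x2)  # equal observables, equal decision
```

Under this rule, g(f(x)) and g(f(x), p(x)) coincide by construction: the protected feature can be fed in, but the argmax over the oracle's utilities already determines the answer.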
