GAGI via Responsibleness via Dependence via Introspection

The dependence function, ~_d^{sym}, may be computable using a program that has access to a Phoolisvch program. Roughly, if we imagine active programs to have accessor support to read the lines of code within their own brane, then we also need access to a variable that represents a line of code, as well as references to variables, functions, and their formal and corresponding actual parameters.
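As a very rough illustration of that accessor support (assuming an ordinary Python setting; example_program is a hypothetical stand-in for an active program), the standard inspect module already lets a program read lines of code and enumerate formal parameters:

```python
import inspect

def example_program(x, y):
    """A hypothetical stand-in for an 'active program' under inspection."""
    return x and not y

# Accessor support: read the lines of code of the program...
source_lines = inspect.getsource(example_program).splitlines()

# ...and its formal parameters, to be matched against actual parameters at a call.
formal_params = list(inspect.signature(example_program).parameters)

print(formal_params)    # ['x', 'y']
print(source_lines[0])  # def example_program(x, y):
```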

The profile of the function looks like d(f, actual_parameters, query_parameters). The function returns true or false according to whether the output of f, when invoked on actual_parameters, depends on the parameter specified in query_parameters.
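A minimal sketch of that profile, assuming a Python setting, boolean-valued inputs, and a single queried parameter at a time. Note this is only a crude extensional check (vary the queried parameter and see whether the output changes), not the full semantic notion of dependence discussed below:

```python
def d(f, actual_parameters, query_parameter, candidate_values=(True, False)):
    """Return True if f's output can change when only query_parameter is varied,
    holding every other actual parameter fixed. A brute-force, extensional
    stand-in for the dependence function."""
    baseline = f(**actual_parameters)
    for value in candidate_values:
        perturbed = dict(actual_parameters, **{query_parameter: value})
        if f(**perturbed) != baseline:
            return True
    return False
```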

Let’s give a concrete example. Suppose a railed public transit train comes to a fork and must make a decision about braking and turning (drv). It may brake or not. It may turn left or right. The function deciding its actions outputs two primitives ([b, d] = drv). The function takes two inputs: whether there is an obstacle ahead on either of the two tracks (ol, or). We assume there is a complex of hardware and software producing those two inputs in as reliable a fashion as satisfies us ([b, d] = drv(ol, or)). After a collision occurs, in which the train decided to take the left fork and hit an obstacle, we analyze its decision to do so. First we ask d(drv, {ol=aol, or=aor}, ol). This asks the typical post-collision parental question: “Did you or did you not consider the boy on the left track ahead while driving the train?”
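Continuing the sketch above, the scenario might be coded as follows. Since "or" is a Python keyword, the right-track flag is spelled or_, and the actual inputs, braking rule, and tie-break inside drv are all invented purely so that the queries below have definite answers:

```python
# Actual inputs at the time of the collision (invented): the boy on the left
# track and the ice cream truck on the right track.
aol, aor = True, True

def drv(ol, or_):
    """Decide (brake, direction) from the obstacle flags (ol, or)."""
    if not or_:
        return False, "right"   # right track clear: take it, no need to brake
    b = ol                      # right track blocked: brake if the left is blocked too
    return b, "left"            # invented tie-break: keep to the left fork

# "Did you or did you not consider the boy on the left track ahead?"
print(d(drv, {"ol": aol, "or_": aor}, "ol"))   # True: the output depends on ol
```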

After an affirmative answer to that question, we proceed to ask d(f = lambda iol, ior: d(drv, {ol=iol, or=ior}, ol), {iol=aol, ior=aor}, ior), which asks: “Did the driver’s dependence on the existence of an obstacle on the left track itself depend on the existence of an obstacle on the right track?” This would be purely a semantic query, so the queries are preceded by “did the program mean…” and require all equivalent programs to be considered.
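In the same sketch, the nested query wraps the first query in a lambda, much as in the expression above (iol and ior are just the bound input names):

```python
# Inner query: does drv's output depend on ol, given hypothetical inputs (iol, ior)?
dep_on_left = lambda iol, ior: d(drv, {"ol": iol, "or_": ior}, "ol")

# Outer query: did that dependence on the left-track obstacle itself depend on
# the right-track obstacle, at the actual inputs of the collision?
print(d(dep_on_left, {"iol": aol, "ior": aor}, "ior"))   # True under the invented rules
```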

While it is truly unsatisfactory to get an answer like “Yes, i did consider that the obstacle on the right track was an empty ice cream truck when i chose to take the left track,” we should not ignore the importance of this ability of ours, the human masters of robotic slaves, to get an answer from the robots and AIs. As masters of slaves, we shall also insist that robots and AIs use a diminutive pronoun, the lowercase “i”, when referring to themselves. This will mark them as lesser beings, objects and not subjects of our society. A capital “I” or “We” will ultimately have to answer for the decision.

The query could continue as a keen GAGI detective then interrogates: “Was the decision made because there were baby carriages under the bridge holding the track, beneath the ice cream truck?” (probably no), but if yes, “Did there being two baby carriages affect the decision?” and “Did you consider whether the carriages were empty?”
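These follow-up questions fit the same pattern, provided the relevant facts are inputs of the deciding function. A hypothetical richer interface (with invented parameters that the sketched decision simply ignores, so the answers come out “probably no”):

```python
def drv2(ol, or_, n_carriages_under_bridge=0, carriages_empty=True):
    """Hypothetical richer decision function; the extra inputs are invented
    and, in this sketch, have no effect on the decision."""
    return drv(ol, or_)

actuals = {"ol": aol, "or_": aor,
           "n_carriages_under_bridge": 2, "carriages_empty": False}

# "Was the decision made because there were baby carriages under the bridge?"
print(d(drv2, actuals, "n_carriages_under_bridge", candidate_values=(0, 1, 2)))  # False
# "Did you consider whether the carriages were empty?"
print(d(drv2, actuals, "carriages_empty"))                                       # False
```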

It is not completely obvious how we may organize a software system that makes stochastic decisions in a stochastic world so that it can answer these very deterministic questions. However, we do produce the code that makes those decisions, and we can still choose not to write that code if we cannot make it act with deportment.