One wonders whether there are thoughts that people cannot have, because those who did have them never lived on to pass the thought to others.
Just as a crashed computer cannot infect another computer with the virus that crashed it, could there be some class of thoughts, inducible in normal humans, that cause problems so catastrophic that those who accidentally reached them in the course of evolution never lived to tell?
Yoshua Bengio’s talk at the 2016 Bay Area deep learning school inspired this nightmare last night… What if, just as we don’t need to crash a car a thousand times to learn how not to crash a car, we have learned not to think of many things that would crash our own programming? Another somewhat subliminal contributor to this nightmare was his computer crashing three or four times during the presentation.
What if the reason we don’t know how the human brain learns is that this knowledge would cause instability in the individual or in society, crashing them, so severe that we have built-in mechanisms preventing us from understanding it? Just as the same unknown mechanism makes it easy for us to understand physics, logic, emotions, and so on, could it also inhibit us from having some class of knowledge or skill?
The obvious example would be if I derived a deterministic way to make a person stop peeing, or rather, if I learned to stop myself from peeing. But there could be less obvious examples: things that cause psychological changes, such as inducing forgetfulness. For example, if I figured out how to forget and then immediately forgot how to do so. This kind of “bug” or boundary condition seems very possible.
Aside from admitting that this is structurally possible, there is the added effect of evolution. If true, evolutionary theory tells us that such thoughts are precluded from happening precisely because those of us who survive have either evolved away these dangerous edge cases or developed very strong, redundant inhibitory systems to prevent them from occurring.
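To make the selection argument a little more concrete, here is a minimal toy sketch (nothing from Bengio’s talk; the population size, transmission probability, and the `simulate` helper are all invented for illustration). It models an idea spreading person to person, except that a “crashing” idea removes its thinker before they can pass it on, so it vanishes while a harmless idea spreads freely.

```python
import random

random.seed(0)

POP = 1000          # hypothetical population size (made-up parameter)
GENERATIONS = 50
SPREAD_PROB = 0.3   # chance a carrier passes the idea on each generation (made-up)

def simulate(lethal: bool) -> int:
    """Return how many people carry the idea after GENERATIONS steps.

    A lethal ("crashing") idea removes its host from the population the
    moment it is thought, so the host never gets a chance to spread it.
    """
    alive = set(range(POP))
    carriers = {0}  # one person stumbles onto the idea
    for _ in range(GENERATIONS):
        new_carriers = set()
        for person in carriers:
            if person not in alive:
                continue
            if lethal:
                alive.discard(person)  # thinking the idea "crashes" the thinker
                continue               # a crashed host passes nothing on
            if random.random() < SPREAD_PROB:
                new_carriers.add(random.choice(tuple(alive)))
        # dead hosts drop out of the carrier pool automatically
        carriers = (carriers | new_carriers) & alive
    return len(carriers)

print("harmless idea carriers after 50 generations:", simulate(lethal=False))
print("crashing idea carriers after 50 generations:", simulate(lethal=True))
```

The harmless idea ends up widespread, while the crashing idea's carrier count drops to zero in the first generation: exactly the sense in which a sufficiently dangerous thought could never be handed down to us.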
Lastly, is knowledge of how we learn such dangerous knowledge? Do we constantly have mini-crashes? Just as we are about to learn the secret of learning itself, does something peripheral, like presentation software, crash in us and prevent that thought from occurring?
Thankfully, we know we have not evolved away the ability to think about this possibility; perhaps there is still hope?