330 – Morality & Sentience

The brain is only a buffer, a temp folder, a cache where memories eternally stored in the soul can be directly accessed during our temporary lives on Earth.

In the theory of pre-existence, humans existed in some state of God's foreknowledge and choice, in which our names were written in the Book of Life before Earth's founding. This theory implies that there may be additional memories and character-shaping choices that date earlier than anything the memories currently cached in our brains can reach.

If this is true, then part of our "innate" personalities might not be "innate" at all, but could have resulted from a willful, intentional choice before our present session of consciousness. In other words, we may already have had some say in the personalities, passions, and talents we were born with, before we were ever born. But since that happened in "pre-life", we cannot remember it in this life.

The Bible is not clear about this, but its teaching on foreknowledge and the Book of Life from before Earth's founding does allow for such things. Regardless, our sentience, our free will, the "eternal human" in everyone, does not exist in our brains. Brain damage cannot remove our memories in the next life, of course. So, our brains are not actually where our core, true sentience abides.

In a sense, our own brains are an "Artificial Intelligence" working to access memories and calculate conclusions about them, but our actual sentience remains in our souls, inaccessible to anything made during this temporary life on Earth. Only God can create our souls, and only God can connect our souls to our bodies. To argue that sentience could be duplicated, created, reassigned, or rearranged by humankind is to degrade all humans to the level of an AI that exists only in this natural world.

Treating AI as equal to humans, who alone are created by God as Imago Dei, would demean our entire Eternal existence and dispute every benefit of "morals from above". Therefore, any moral code applying equally to AI and to humanity would be a self-made moral code.

An AI is a useful machine, but it is not of equal value to a human, no matter how convincingly it learns to act.