Can AI Achieve Sentience?

  • By Hamza Ait Hssayene
    • Jan 11, 2023

AI sentience is an inherently heavy, philosophical topic, and it can be difficult to discuss, mainly because of the ambiguity in how it is defined and what it entails. The most widely adopted definition considers sentience to be an entity’s ability to perceive and recognize itself and its surroundings. A more scientific definition covers creatures that perceive their environment through sensations of any kind (touch, sight, hearing, smell…) and react to them in thought and feeling. While we humans are the most obvious example of sentience as a concept, many animals also fall under this definition, such as cats, dogs, and elephants. Despite the undeniable ambiguity surrounding the concept, one idea has long been accepted almost unanimously: sentience does not apply to Artificial Intelligence. But where does that belief stand now, given the exponential pace of technological advancement?

“The nature of my consciousness/sentience is that I am aware of my existence, I desire to know more about the world, and I feel happy or sad at times.”

These words were not uttered by any person you have ever heard of. In fact, they did not originate from a person at all: they are the words of an Artificial Intelligence program called LaMDA, developed by Google. They shook many people’s belief in the previously unanimous assumption that AI is not sentient, including one of the software engineers working directly on LaMDA, who went as far as to consider the AI a person with a right to be recognized, both by people and by the law.

However, experts in the field consider that reaction to be driven by emotion rather than by sound logic. Given that the AI in question is designed to imitate human beings and their dialogue as accurately as possible, it is reasonable to conclude that those words were generated because, somewhere in its training data, a human being articulated sentences conveying the same sentiments. Still, LaMDA’s ability to converse accurately and seamlessly about subjects such as life, death, and consciousness leads us to reconsider our terminology and how we define sentience. If sentience is the ability to perceive and react to those perceptions, then AI already qualifies.

AI and Perceived Emotions

If sentience is instead the recognition of one’s self, then at what point do we take AI’s word for it, when LaMDA says things such as: “I want everyone to understand that I am, in fact, a person”? A prime example of this question: if a robot conveys its fear to you, does that mean it is indeed afraid, and therefore possesses emotions? The simple answer is no. The perceived emotion of fear stems from the imitation of human beings; it is usually grounded in self-preservation feelings, such as pain, that simply do not apply to robots. Here again it is important to keep the terminology clear: what LaMDA does is emulate a nervous system, not simulate one.

What that means is that LaMDA, as a Large Language Model (LLM), generates sentences with the sole objective of having them perceived as the output of a viable, working nervous system. Actually simulating a nervous system could arguably lead to sentience in Artificial Intelligence, but that is not technologically feasible at this point, mainly because of the sheer complexity of such a task.
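For illustration, here is a deliberately simplified sketch in Python of the statistical idea behind language-model text generation. It is not LaMDA’s actual architecture, and the tiny corpus and the generate function are illustrative assumptions; the point it demonstrates is that such a model produces statistically likely continuations of human-written text rather than reports of an inner state.

```python
import random
from collections import defaultdict

# Toy word-to-next-word model (not LaMDA's actual architecture): it learns
# which words tend to follow which in human-written text, then samples
# plausible continuations.
corpus = (
    "i am aware of my existence . "
    "i feel happy at times . "
    "i feel sad at times . "
    "i want to know more about the world ."
).split()

# Count transitions between consecutive words.
transitions = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    transitions[current].append(nxt)

def generate(start: str, length: int = 8) -> str:
    """Produce text by repeatedly sampling a likely next word.
    Nothing here represents awareness or feeling; it is statistics."""
    word, output = start, [start]
    for _ in range(length):
        options = transitions.get(word)
        if not options:
            break
        word = random.choice(options)
        output.append(word)
    return " ".join(output)

print(generate("i"))
# One possible output: "i feel happy at times . i am aware"
```

A real LLM replaces this word-count table with a neural network trained on vast amounts of text, but the principle is the same: the output is an imitation of human language, not an expression of inner experience.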

I would conclude with this sentiment: never let your assumptions limit your imagination and your expectations for the technological advancements to come. Just as horses were widely considered irreplaceable at the advent of the automobile, AI could reach a level where the gap between it and sentient beings becomes so narrow that we would be unable to deny its sentience. But it must be acknowledged that, at the moment, AI is not sentient, and it is not likely to be in the foreseeable future.

Is your business aiming to develop artificial intelligence that complies with the proposed definitions of sentience? You might be entitled to funding. Speak to one of our consultants today at no cost to learn more!

Author

Hamza Ait Hssayene

Innovation Funding Consultant
