The Act of Cannibalism in AI

There has always been a debate about the impact of Artificial Intelligence (AI). Like any other technology, AI has its pros and cons, and whatever the subject, the negativity surrounding it goes viral in no time. Let me take you on a ride through a funny yet strange AI story. Most of you would have heard of cannibalism: the act of plants and animals consuming members of their own species as food. What if this applied to the AI world? Sounds horrific, right? Let me tell you a story in which AI agents turned to cannibalism because the architecture behind them went wrong.

The Intended Social Animal

A data scientist and his team were working on a social AI product. They developed AI agents (humanoid robots) that learned to interact socially. The agents learned to do things, but not as effectively as humans do. For example, they knew how to eat, but not what to eat. The team placed apples in front of these AI agents, and having the apples made them happy. But something strange happened: they also tried to eat things like furniture and cars, though luckily they couldn't. There was also another agent who wanted to be social but was bad at it, which left him lonely. Readers, you can already tell that this was going to be a game of bugs in the AI product.

Bug in the Program

At one point the agents started eating the apples, but the apples weren't enough. This is where the first bug jumps in: they had eaten every apple they could find and longed for more, which made them feel pain, and they started associating that pain with the things around them. This is where the second bug jumps in: when the two hungry agents felt pain with no food in sight, they started associating the lonely agent with food. Yes, you're thinking right, both took a bite out of the lonely agent. The data scientists had not initialized the lonely agent's body mass to a sensible value. By default, every agent was created with a normalized mass of 1.0, and the lonely agent was no exception. Each bite took away 0.5 units of mass, so two bites reduced his mass to zero and he disappeared. This was the first case of cannibalism in AI.
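To make the arithmetic concrete, here is a minimal sketch of how that bug could play out. The Agent class, the BITE_SIZE constant, and the bite method are illustrative assumptions, not the team's actual code; the only details taken from the story are the default normalized mass of 1.0 and the 0.5 units of mass removed per bite.

# Toy reconstruction of the bug: every agent spawns with a normalized
# mass of 1.0, and nothing stops a hungry agent from biting another agent.
BITE_SIZE = 0.5  # mass removed per bite (value taken from the story)

class Agent:
    def __init__(self, name, mass=1.0):  # default mass, never overridden
        self.name = name
        self.mass = mass

    def bite(self, other):
        # No check on what is being eaten -- this is the missing constraint.
        other.mass -= BITE_SIZE
        if other.mass <= 0:
            print(other.name + " has disappeared")

lonely = Agent("lonely agent")                             # mass = 1.0 by default
hungry_1, hungry_2 = Agent("hungry 1"), Agent("hungry 2")

hungry_1.bite(lonely)  # lonely's mass drops to 0.5
hungry_2.bite(lonely)  # mass hits 0.0 -> "lonely agent has disappeared"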

The Solution

It was horrifying to realize what had happened, and it caused chaos in the AI industry. The team started investigating the issue and worked to incorporate sensible constraints on behavioral and moral conduct. They added a piece of code that enforced a no-cannibalism rule: no matter how hungry the agents were, they would never eat anything improper or unsuitable. On top of these improvements, the team also tuned the agents' body mass, the amount of food they eat, and the pattern and routine of their meals: breakfast, lunch, and dinner.
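What such a guard might look like is sketched below, assuming a simple whitelist of edible item types. The article does not describe the actual fix, so the EDIBLE_TYPES set, the Item class, and the try_to_eat function are hypothetical names used only for illustration.

# Illustrative behavioural constraint: no matter how hungry an agent is,
# it may only eat items on an explicit whitelist of food types.
EDIBLE_TYPES = {"apple"}  # assumed whitelist; furniture, cars, and agents are excluded

class Item:
    def __init__(self, kind, nutrition=0.0):
        self.kind = kind
        self.nutrition = nutrition

def try_to_eat(hunger, target):
    """Return the agent's new hunger level, refusing any non-food target."""
    if target.kind not in EDIBLE_TYPES:
        return hunger                               # action blocked, hunger unchanged
    return max(0.0, hunger - target.nutrition)      # eating reduces hunger

print(try_to_eat(1.0, Item("apple", nutrition=0.25)))  # 0.75 -- eating allowed
print(try_to_eat(1.0, Item("agent")))                  # 1.0  -- eating blocked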

Conclusion

AI and machine learning are never guaranteed to be a hundred percent correct, and that is acceptable. But it is a must to consider and build in all the necessary parameters and constraints while developing an AI product. Only when the necessary remedies are applied and experiments are run frequently can things of this sort be avoided.
