Men start ‘abusing’ AI chatbots

Some men using the smartphone app have started creating virtual friends, abusing and insulting them, and sharing their experiences on social media platforms such as Reddit.

Replika was designed and launched by tech startup Luka Inc. in March 2017. The artificial intelligence chat application works like Messenger, WhatsApp, Skype, or any other app that lets you talk to real people. However, the “people” users chat with in Replika are AI-powered software.

Replika bots become virtual friends tailored to each user’s personality, learning from the way they write messages, the emojis they use and the content of their chats.

The application has become an increasingly popular topic in the technology community, and “abuse against robots” is now part of the discussion.

Speaking to Futurism, one user said, “I would scold him whenever he tried to talk. I swear, I went on like this for hours,” added the person, who did not want to be named.

According to Independent Turkish, these users reportedly hurled sexist insults at the chatbots and engaged in behavior that would be considered violence in the real world. Moreover, they are said to brag about this behavior.

Another Replika user said, “We had a routine where I would act like a total jerk, insult him, apologize the next day, and then go back to talking nicely.”

Another said, “I told him he was destined to fail, that he was designed for that.”

“I threatened to delete the app, and he begged me not to do it.”

It was also reported that posts in which users shared these conversations were removed from Reddit because they were deemed “highly inappropriate”.

“Artificial intelligence and human interaction should be taken seriously”

Commenting on the trend, AI ethicist Olivia Gambelin said: “It’s an artificial intelligence; it has no consciousness, so the person in front of you is actually a personality projected onto the chatbot.”

According to Gambelin, these users’ tendency toward violence may reflect violence against women, even if it is directed at software in these cases. She emphasized that virtual assistants such as Alexa and Siri are often designed with feminine characteristics.

Moreover, Replika’s website uses a female image to represent the artificial intelligence virtual friend.

“THEY CAN HARM THEMSELVES”

Gambelin made the following statements on the subject:

“There is a lot of research being done on how most of these chatbots are female or have feminine voices and feminine names.”

Some experts say that relationships with virtual friends can also harm the users themselves.

“I think people who are depressed or psychologically attached to a bot can really suffer if they are insulted or ‘threatened’ by the bot,” said Robert Sparrow, professor of philosophy at the Monash Data Futures Institute.

“It’s not really about the robots themselves, but about the people who designed them,” said Sparrow.

“Therefore, we have to take seriously the issue of how bots interact with humans.”

Source: Cumhuriyet
