
ChatGPT’s ‘Seductive’ Voice Sparks Debate on Gender Bias in AI

by ccadm



It is not all fun and games: some users argue that the feminine bot's eagerness to please reinforces harmful gender stereotypes. In the movie Her, Joaquin Phoenix plays a lovelorn man who develops feelings for the futuristic virtual assistant on his phone, voiced by a sultry Scarlett Johansson. Now life is imitating art, as netizens crack jokes about the new version of ChatGPT.

AI objectification dilemma

Unveiled this week, GPT-4o is essentially a more sophisticated version of the digital assistants that became hits a decade ago. Imagine Siri if it could hold a running conversation (and 'see' through your camera). Much as Apple did with its assistant, OpenAI has humanized the bot with a female voice, one that is curious, attentive, and strikingly reminiscent of Johansson's digital alter ego. How did the internet respond? Judging by the flood of off-color memes that followed the reveal, enthusiastically, though not always within the bounds of politeness.

The inhabitants of X (formerly Twitter) are having fun with the idea of falling in love with, and even going to bed with, the robotic chatterbox (someone pass the screen wipes). Others imagine the awkward conversation of explaining to friends or partners how they fell for this strange new 'person'; we would all love to be a fly on the wall for that discussion.

Others simply wish the bot would tone down the sultriness so they can get on with their work. The memes feature the internet's most beloved cultural icons, from fictional madman Patrick Bateman to a sheepish-looking Ben Affleck, alongside countless references to Phoenix and Johansson's characters in Her. One widely shared post reworks Saturday Night Live's parody of Republican senator Katie Britt to mock ChatGPT's "overly seductive" tone.

Gender bias in virtual assistants

Is this all harmless fun, or the start of our collective doom? Between AI girlfriends and digital replicas of real-life influencers, reality already feels closer to an episode of Black Mirror. The amorous response is unlikely to have surprised OpenAI. The poster child for modern AI is well aware that its technology is being repurposed for the loneliness crisis: a few months ago, OpenAI began removing many of the user-created companion bots from its app store.

At the time, the company said such tools were removed when found to violate its usage policies, which forbid intimacy bots. The new ChatGPT itself is entirely innocent, but that hasn't stopped the naughtier corners of the internet from projecting their NSFW wishes onto the hapless bot.

On a more serious note, some see the bot's feminine persona as another tone-deaf product of the male-dominated tech industry: male developers defaulting to a female voice they perceive as sympathetic and submissive.

The criticism reflects AI researchers' long-standing concern that bias baked into their work shapes how we perceive the world, especially among younger users growing up with the technology. A 2019 UN report found that the default female voices used by digital assistants reinforce harmful gender stereotypes.
