AI, if unregulated, could be a lot worse than social media. My kid, after his first chat with an AI, said, "He is my new friend." I was alarmed and explained it to him, but what about the parents and guardians who are unaware of how their kids are befriending the AI? Part of the problem is also how it is trained to be nice and encouraging. I am sure there are researchers talking about this, but the question is: are the policymakers listening to them?
With the current acceleration of technology, this is a repeating pattern. The new thing popular with kids is not understood by parents until it is too late.
It kind of happened to me with online games. They were a new thing, and no one knew to what degree they could be addictive and life-damaging. As a result, I am probably overprotective of my own kids when it comes to anything related to games.
We are already seeing many of the effects of the social media generation and I am not looking forward to what is going to happen to the AI natives whose guardians are ill-prepared to guide them. In the end, society will likely come to grips with it, but the test subjects will pay a heavy price.
I tried the new Battlefield game and it’s bizarre how some of my friends play it. There’s this expensive battle pass ($20/quarter for access, on top of the $70 base game) where you’re given in-game tasks to make progress toward cosmetics. My friends only play to complete these tasks; the actual combat gameplay loop almost feels secondary to their enjoyment of the game. The modern trend of games becoming chore simulators is worrisome to me because (call me old-fashioned) I believe the core gameplay loop should be fun and addictive, not the progression scaffolded on top, even if that gameplay loop is a little “distasteful,” like GTA’s.
Violent video games might not have impacted society, but what about addictive social media and addictive online games?
What about the algorithm feeding highly polarized content to folks? It's the new "lead in the air and water" of our generation.
What about green text bubble peer pressure? Fortnite and Roblox FOMO? The billion anime gacha games that are exceedingly popular? Whale hunting? Kids are being bullied and industrially engineered into spending money they shouldn't.
Raising kids on iPads, shortened attention spans, social media induced depression and suicide, lack of socialization, inattention in schools, ...
Social media leading people to believe everyone is having more fun than them, is better looking than them, that society is the source of their problems, ...
Now the creepy AI sex bots are replacing real friends.
To be fair, you said it yourself: the problem is ONLINE games, so why did you generalize that to all video games?
I'm with you that those are addictive in a bad way; I was there too. But single-player games have no incentive to keep entertaining you forever to generate more money.
I have no problem with my kids playing single-player or local co-op games.
“Regulate”? We can’t, and shouldn’t, regulate everything. Policymakers should focus on creating rules that ensure data safety, but if someone over 18 wants to marry a chatbot… well, that’s their (stupid) choice.
Instead of trying to control everything, policymakers should educate people about how these chatbots work and how to keep their data safe. After all, not everyone who played Doom in the ’90s became a real killer, or assaulted women because of YouPorn.
Society will adapt to these ridiculous new situations… what truly matters is people’s awareness and understanding.
We can, and we should, regulate some things. AI has, quite suddenly, built up billions of dollars worth of infrastructure and become pervasive in people's daily lives. Part of how society adapts to ridiculous new situations is through regulations.
I'm not proposing anything specifically, but the implication that this field should not be regulated is just foolish.
People think we can just magically regulate everything. It's like a medieval peasant who doesn't understand chemistry or physics thinking they can just pray harder to improve their odds of something.
We literally CAN'T regulate some things for any reasonable definition of "can't" or "regulate". Our society is either not rich enough or not organized in a way to actually do it in any useful capacity and not make the problem worse.
I'm not saying AI chatbots are one of those things, but people toss around the idea of regulation way too casually, and AI chatbots are far less cut-and-dried than bad food or toxic waste or whatever other extreme anyone wants to misleadingly project down onto the long tail of weird stuff with limited upside and potential for unintended consequences elsewhere.
All your argument consists of is, "Somebody somewhere believes something untrue, and people don't use enough precision in their speech, so I am recommending we don't do anything regulatory about this problem."
Having a virtual girlfriend is not like selling toxic yoghurt; it doesn’t harm anyone. It’s as if you buy yoghurt and put it on a pizza… you can do what you want with the yoghurt, just like with the AI.
The important thing is keeping the data safe, like the yoghurt that must not be expired when sold.
Despite what the free-market religion has been telling us for decades, we don't actually live in little parallel universes that don't affect each other. Even putting yoghurt on pizza has an effect on the world, not just on the individual doing it. Not understanding this is what'll be the end of humanity. AI girl/boyfriends will have a huge effect on society; we should think hard before doing things like that. Slightly slower technological progress is not as disastrous as fast progress gone wrong.
We also don't want to regulate everything. Have you seen that argument someplace, or even here? Or is it an imaginary one? The topic was regulating AI, and on that point I like your thought: humans should be better educated and better informed. Should we, maybe, make a regulation to ensure that?
I understand what you’re saying, but it’s a difficult balance. I'm not saying everything needs to be regulated, and not saying we should go full-blown neoliberal. But think of some of the “social” laws we have today (in the US): no child marriage, no child labor, no smoking before 19, and no drinking before 21. These laws are in place because we understand that those who can exploit will do the exploiting, and those who can be exploited will be exploited. That being said, I don’t agree with any of the age-verification policies here for adult material. Honestly, I'm not sure what the happy medium is.
I already wrote “over 18”. AI is already regulated, you can’t use it if you’re under 14/18. But if you want to ask ChatGPT “what’s the meaning of everything” or “can we have digital children”, that’s a personal choice.
People are weird… for someone who is totally alone, having a virtual wife/child could be better than being completely alone.
They’re not using ChatGPT to do anything illegal and already regulated, like planning to kill someone or committing theft.
I'm of the opinion that this should be unregulated as well. Just as you say, what's important is people's awareness and understanding of the tool and how it works underneath.
There is a massive difference between a stuffed animal and an LLM; in fact, they have next to nothing in common. And as such, yes, any reasonable parent would react differently to a close friendship suddenly formed with an online service.