Yeah, there is a growing need for AI developers to be responsible. With that said, the market seems to be demanding AI tools that have "feeling" in addition to function. Anyone else agree?
More than the market driving it, "feelings" naturally emerge in systems that have to deal with uncertainty. It's nature's hack for when logic and rationality fail.
The only thing predictable here is that AI devs will start having existential meltdowns trying to make these things predictable and controllable. Language + emotion + social context = chaos.
Engineers have historically been the wrong folks to handle this kind of thing (chaos management). Expect more philosophers and psychologists to get roped in. And a lot of failed experiments treating people like lab rats.
Human delusion and loneliness aren't even science fiction, just a sad fact of the human condition. If a person can fall in love with a photograph or an anime pillow, falling in love with a chatbot doesn't seem far-fetched.