• 2 Posts
  • 41 Comments
Joined 1 year ago
Cake day: June 9th, 2023

  • They’ve made it clear they won’t. And since most subs are only going dark for a measly 48 hours, they have no incentive to. It’s literally like that “Oh no, anyway…” meme.

    And think of it this way: even if they do revert the changes (or you just decide to please /u/spez and only use the official app), do you think the platform will get better or worse from here? He’s shown his hand, and it’s nothing good for the mods or the users.

  • As long as it’s approached as a “brain training assistant” (or some other market-y buzzword) and used with giant disclaimers, I’m totally for using it with old, lonely people. It might not be a perfect aid, but it can help in certain situations. Knowing how our society works, however, Big Company A is going to perfect the tech, make people dependent on it, and then scam them.

    Considering how popular Farmville was on Facebook, I shudder to think what a finely tuned AI made by a for-profit company will be capable of doing to old, lonely people.

    Do you really want your mom chatting with a for-profit AI (like Google Bard) about you or your family to feel less lonely? I’d sooner let my brain rot, but to each their own.


  • I think it’s dangerous to try to cure loneliness with an AI, regardless of sophistication and tuning, because you end up with a human who’s been essentially deceived into feeling better. Not only that, but they’re going to eventually develop strong emotional attachments to the AI itself. And with capitalism as the driving force of society here in the U.S., I can guarantee you every abusive, unethical practice will become normalized around these AIs too.

    I can see it now: “If you cancel your $1,000-a-year CompanionGPT, we can’t be held responsible for what happens to your poor, lonely grandma…” Or it will be even more direct and tell the old, lonely person: “Pay $2,500 or we will switch off the ‘Emotional Support’ module on your AI. We accept PayPal.”

    Saying AIs like this will become normalized doesn’t mean it’s an ethical thing to do. Medical exploitation is already normalized in the US. Not only is this dystopian, it’s downright unconscionable, in my opinion.