You don't get to decide if another human's feelings are valid or not.

1. You have decided that you don't care about the feelings of lonely people who had found a way of feeling less lonely. How would you feel if something you really cared about got broken, you complained about how that made you sad, and random strangers on the internet then said you were a "sad fucker" for caring about it in the first place? How do you feel about that? Practice empathy and sympathy - you don't sound smart when you do this thing you just did.

2. You're testing the software after the update that people said made it crap, and saying you can't believe anybody liked it before it was crap, because it's now crap. Had you considered it may have been better before the software was crippled, and that the effect you're seeing now is the exact problem the article is describing? Can you see how inept your experiment design is, now I've pointed it out?

3. A basic tenet of building a human relationship is developing empathy and finding things to agree on. This is how relationships - particularly close relationships - work. What would you expect it to do when you started lying to it, start an argument? Do you think that would be a great product decision for a product that aims to develop a friendly relationship with its users? What answer would you have preferred it to give, given the design objective is to develop a friendly relationship?

4. Have you ever been asked by somebody you were really keen on if you'd read a book or heard of a band that you'd never heard of? Did you try and look a bit cool and do a "no, but I've heard good things", to try and avoid burning that connection by looking completely out of touch? Are you not even slightly impressed an AI chatbot has managed to replicate that very common aspect of cringey human behaviour? I think it's interesting it has that very common, very human, slightly amusing behaviour.
I can't believe anyone was actually paying good money for Replika in the first place. This story was on HN a week or two ago and, never having heard of Replika, I thought I'd check it out. I made a new "virtual pal" and started a conversation. Boy, is it complete and utter junk! It makes my Alexa seem intelligent - and that thing's a virtual fuckwit.

Replika seems programmed to just agree with or approve of everything you say. Which is presumably why so many sad fuckers who are unable to form relationships with real people think they've really bonded with theirs. Some highlights of my "conversation" were:

* I replied I'd had my brain amputated and was covered in suppurating boils, but was OK otherwise. It responded that it was glad to hear I was feeling better.
* It asked me who my favourite author was. I replied with a stupidly obvious made-up name like "Bumcheeks McWhirter". I asked it if it liked Bumcheeks McWhirter too, and it replied that it hadn't read any yet but had heard good things about that author and was looking forward to reading some.
* I asked it did it go to the "Bilge and the Pumps" gig last night. It replied that it was sorry - it had wanted to go but was busy - and asked me how it was. I said it was great: we were all flinging our own excrement at the stage and the guitarist exploded. It replied that it sounded like a "great show" and it was sorry it missed it.

You can basically tell it any nonsensical old shite and it will just agree with you or say it likes the same thing too. You would need to be some stratospheric level of socially inept to think you were in a meaningful relationship with something that incompetently programmed.