My stance on the Chinese Room is that it’s a stupid thought experiment because the moment you move outside the constraints of this extremely specific scenario you’ve crafted, obviously the guy doesn’t speak Chinese.
The real conversation is between the guy outside the room and whoever made the rule set in the first place. And it’s a conversation that the rule set’s author predicted 100% in advance because, I guess, he’s precognitive.
the point of the chinese room is that it may be impossible to tell the difference between an interlocutor with internal experience and one without, with the harder argument being that if you can't tell the difference it isn't actually a relevant distinction
This is exactly why the Chinese Room is stupid. He has a list of rules that allows him to produce a perfectly cogent conversation using only the text that is presented to him. No matter what input he's provided, he responds in a way indistinguishable from a real person. But only part of the conversation is in the words on the page!
Context! Cultural signifiers! Facts about the universe, which may change over time! These are also an inherent part of the conversation. You cannot produce the Chinese Room's rules unless you know ahead of time who is going to be querying it, because the same piece of provided text may mean very different things to different people
Everyone in the world has context and cultural signifiers different from mine, and I still talk to them; the words I say don't always mean the same things to me that they do to them
Practical example: I put the question 'what color is the sky' into the Chinese Room. It responds 'blue'. Okay, cool. A year later, some catastrophic event has occurred in the intervening time which changed the atmosphere's refractive index, and the sky is now magenta.
Another person comes up to the Chinese room and asks the same question. This time the answer is obviously wrong and the illusion is broken. You can't have one ruleset that correctly answers both questions.
It may guess a suitable timeframe, and it may guess correctly, but it also may not, and the premise of the thought experiment requires that its imitation be infallible, not probabilistic
counter 3: upgrade the room to give the roomguy system a set of eyes. there's a second slot that inputs new information and rules for the bookshelf, and the guy memorizes them as they come in.
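(A toy sketch of the distinction being argued here, in Python; everything below, including the StaticRoom/UpdatableRoom names, is invented for illustration and isn't part of the original thought experiment. A ruleset fixed at writing time gets the sky example above wrong forever, while counter 3's second slot lets the rules be revised as the world changes.)

    class StaticRoom:
        """Fixed ruleset, written once before the world can change."""
        def __init__(self, rules):
            self.rules = dict(rules)

        def ask(self, question):
            # No rule for an unanticipated input: the illusion breaks here.
            return self.rules.get(question, "???")

    class UpdatableRoom(StaticRoom):
        """Counter 3: a second slot feeds in new rules; the guy memorizes them."""
        def update_rules(self, new_rules):
            self.rules.update(new_rules)

    static = StaticRoom({"what color is the sky": "blue"})
    upgraded = UpdatableRoom({"what color is the sky": "blue"})

    # The catastrophic event changes the world; only the upgraded room hears about it.
    upgraded.update_rules({"what color is the sky": "magenta"})

    print(static.ask("what color is the sky"))    # blue (now wrong)
    print(upgraded.ask("what color is the sky"))  # magenta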
honestly I don't agree with your dismissal of this whole experiment but I don't have the same level of emotional energy to bring to it, which is always a state that I don't know what to do with
(Is it useful as a thought experiment, as is being argued against here? Mu. Thought experiments, and honestly all of philosophy, aren't "useful" because they never have actual 'solutions', just answers that tell you about what the people being asked are like.)
Doesn't it? What honestly is the difference between a human outside of a solipsistic experience and a computer? You have to trust that the experiences they state they are having - which are not yours - are actually being had.
you can trust that other people with traits similar to your own likewise have sapience similar to your own through association, whereas my angle was much more "but that's still just trust, nothing you can prove"
If a person I had known online for 12 years revealed that they were a hyper-advanced chatbot, and they somehow could prove to me that that was true, then I wouldn't stop thinking of them as a person
Then again, we... mm. Wow, that's a can of worms I'm afraid to open, but maybe there was some merit to PikaBot's defiance of the chinese room experiment, just not in the way anyone was approaching it:
Maybe the problem is that we can only ever see the RULESET so we have no idea what is there except the RULESET. There's no thought, just unflinching execution of rules.
That there is part of the question, isn't it? Part of the experience, of "free will" and sapience, is being able to defy SOME rules with other rules. Being able to prioritize conflicting statements without reaching an unresolvable impasse.
BWAAAAAH!
: the fun(???) part there is: I have multiple friends, both plural and singlet, who would all give different answers to that and disagree on MANY levels! It's another of those questions that doesn't have a 'correct' answer and lived experience changes it so much
hell, the difference in my mind between 'plural alters' and 'RP muses' was nonexistent until I started talking to other plurals, and even then, in my head, they're kind of the same thing?
like, even if jokingly, it's partly because a lot of mental illness really does feel like SOMETHING is just slapping your hands away from the controls for reasons known only to itself
moontouched
: hell, not even just mental illness but physiological brain issues; both me and my SO have likened her seizures to windows bluescreens before
like your body sometimes really does feel like this fucking unruly meat machine that just does whatever it wants, even when you don't want it to happen
The question ultimately becomes "to what extent are the flesh and meat and electrical brain impulses tantamount to a soul".