I was feeling really philosophical when I wrote this...

<p>Ad hominem, technically (but not necessarily veering off into nasty name-calling - there’s a difference).
Who the heck taught you logic, again? (just kidding… that would be veering off into ad hominem as well).</p>

<p>Seriously, perhaps you can tell that I really enjoy dissecting people’s logic on CC (especially when there isn’t any, hehe).
Philosophy is all about logic and careful precision in the words you use.</p>

<p>You are missing the point of the thought experiment. The thought experiment is basically “What if our brains were just in a vat and all we experience were merely electrical impulses sent down our nerve endings? What is reality, then?”
It is only meant to provoke thought about the nature of reality.
This is not a question of utility.</p>

<p>“Mind’s eye” = Scott Pilgrim quote.</p>

<p>someone is not getting cherry soda added to his brain vat tonight …</p>


<p>The real question is what would you want to experience with your brain while it’s in a vat.</p>

<p>i would want to swim. i want to attach flippers to it and embed a motor in it that i can flip on by thinking, so i can swim around in the cherry soda in figure eights.</p>

<p>A corollary to the brain-in-a-vat question is which life would you prefer? Would you rather be that brain-in-a-vat, able to experience whatever you please? Or would you rather be a human being, subject to physical constraints? Would you rather be that brain-in-a-vat, able to conjure up endless pleasure? Or would you rather be a human being, still able to experience pleasure, but not always at will? </p>

<p>To summarize, is sensory perception all that matters to you?</p>

<p>yep.</p>

<p>that is precisely why i feel i must help reduce the chance of humans going extinct. because if humans go extinct then i die too and my sensory perception ends :(. not cool, man.</p>


<p>Your sensory perception is not contingent on the survival of your peers. Unless we’re talking about a certain mutual activity.</p>

<p>i know. it’s contingent on my being alive. if an extinction event happens then i lose it. if i die by another means (like by aging or in an accident or whatevs) then i also lose it.</p>

<p>i have to ensure the continuity of my sensory perception in order to even be able to enjoy it. it may seem strange to you, but i happen to think that losing my sensory perception to an extinction event is actually a pretty big risk probabilistically, so i am especially concerned about such events.</p>


<p>Why is there a big probabilistic risk? What do you foresee? I’m interested :).</p>

<p>oh i just buy the arguments made by nick bostrom (the director of oxford’s future of humanity institute) as well as some of the people from the singularity institute for artificial intelligence.</p>

<p>they foresee the rise of smarter-than-human artificial intelligence as the biggest threat to our survival in the near-term future, above all else.</p>

<p>arguments for why this is the case involve the fermi paradox, the doomsday argument, the technical feasibility of such a thing happening, and trends in technology, i think.</p>

<p>You guys are missing the whole point of the brain-in-a-vat hypothetical situation.</p>

<p>This isn’t interpreting the brain-in-a-vat situation correctly.
In the brain-in-a-vat hypothetical situation, you are not able to “conjure up” sensations. The point is that a separate machine/entity (what that entity is, is not relevant to this situation) is sending electrical impulses to your brain, constructing “reality” for you. In other words, if you see a wallet in front of you, it is not because your brain has conjured up that image, but because the wires connected to your brain (where the optic nerve would have connected) have sent it an electrical pulse, which your brain interprets as the image of a wallet.</p>

<p>The question of “Would you want to live in a perfect fantasy where you have godlike powers?” is an entirely different and unrelated question.</p>

<p>Correct, so I’ll revise my question. Would you rather be fed sensations from a machine or would you rather live as a normal person? What if all the machine fed you were pleasurable moments for you to experience?</p>


<p>Ah, perhaps this is so. But casual dismissals and sarcasm are easier, more fun (for me), and — to most audiences — more convincing.</p>


<p>My conclusion would be: someone has too much time and too many resources and doesn’t have anything good to do with it all.</p>


<p>The same thing it was before: an objective standard (in this case, simulated) that brains interpret through metaphor in an attempt to simplify an information overload into a few consistent patterns.</p>


<p>I… I am speechless. </p>

<p>I couldn’t have said it better myself.</p>


<p>The threat is so much more subtle. We won’t have to worry about computers by themselves. We’re going to enhance ourselves with genetic manipulation and, perhaps, computational add-ons. I’d be far more worried about what will happen to the human experience subsequent to such additions and “perfections.” Perhaps not every part of our experience is worth clinging to, but some part of it might be, and I don’t wish to lose that part, or worse (from my perspective), see my children and grandchildren lose it.</p>


<p>Then you should practice rhetoric - certainly a very useful skill. Logic is not meant to appeal to the lowest common denominator.</p>

<p>Again, this misinterprets the thought experiment. The hypothetical situation is only meant to question the nature of reality; it is not meant to provoke thought on whether or not people have too much time on their hands.</p>

<p>This description is vague to the point of being incomprehensible, meaningless, and possibly self-contradictory.</p>

<p>The human condition (the sum of living human experience) has been changing for centuries: agriculture, sedentary lifestyles, iron, writing, political theories, gunpowder, steam, electricity, telephones, nuclear weapons, computers, genetics, and the Internet. Just because an idea or technology (genetic enhancement) will possibly alter human experience in some vaguely defined and as yet unknown way does not mean it will lead to the end of human survival.</p>

<p>terenc, jimbo actually is optimistic about humans not becoming extinct soon, even though he thinks the human condition may change in the future.</p>

<p>yeah, technology won’t lead to human extinction until it does. before it does, there will be a long history of technology not leading immediately to extinction. technology can only be responsible for the extinction of humans once.</p>


<p>You’re just so silly. I feel like I’m having this conversation:</p>

<p>“What if goats were actually pigs?”</p>

<p>“What a stupid question. Then they wouldn’t be goats. They’d be pigs.” </p>

<p>“Ad hominem! And you’re missing the point! The thought experiment is meant to provoke thought on the nature of species distinction!”</p>

<p>“My conclusion is: they’d be pigs.”</p>

<p>“No! You don’t get it, stupid. It’s not about that.”</p>

<p>Only ours is a more… profound… conversation, yes?</p>

<p>Perhaps the thing missing most here is the part where you offer some of the insight that this so-deep thought experiment has conferred upon you.</p>


<p>A might be “A,” but perhaps not. If not, how can one logically confirm such a concept? Hmmm. Ponderous.</p>

<p>You must have more to say than this.</p>