
Tuesday, March 10, 2009

Causal theories of mind

Suppose we have a causal theory of mind, like Lewis's. In this theory, states get their nature from the way they tend to causally interact. Now, suppose that Black, our faithful neurosurgeon with his neuroscope, observes Freddy's brain for all of Freddy's life. Moreover, Black has a script for how every neural event in Freddy's life is to go. As soon as there is a deviation from the script, Black blows up Freddy. Now all of the counterfactuals about Freddy's neural states are destroyed. For instance, assuming it's not in Black's script that Freddy is ever visually aware of a giraffe, then instead of having the counterfactual "Were a giraffe in Freddy's field of vision, Freddy would likely form such-and-such a belief", we have the counterfactual "Were a giraffe in Freddy's field of vision, Freddy would explode." But suppose that all goes according to script. Then the neurosurgeon never interferes, and so Freddy thinks like everybody else, even though all the counterfactuals are wrong. If the causal roles in the theory are spelled out by counterfactuals about the mental states, this case refutes the causal theory.

OK, that was too quick. Maybe the idea is to look at the counterfactuals that would hold of Freddy in a "normal environment" to define the states. But that won't do. Consider the mental state of seeing that things aren't normal. I bet we can't define that simply in terms of normal environments. Moreover, even supposing we can somehow abstract Freddy from his environment, we could make Black a part of Freddy. How? Well, make Black a little robot. Then give this robot one more function: it is also a very reliable artificial heart. Then implant the robot in Freddy's chest in place of his heart. It no longer makes sense to ask how Freddy would act in the absence of Black, since in the absence of Black, who is now Freddy's artificial heart, Freddy would be dead.

Maybe you think that Freddy is just a brain, so the heart is just part of the environment. Fine. Take some part of the brain that is important only for supplying nutrition to the rest of the brain, but that is computationally irrelevant. Replace it by Black (a robot that fulfills the functions of that part, but that would blow up Freddy were Freddy to depart from the script). And again we've got a problem.

We can perhaps put Black even more intimately inside Freddy: we could make Black a mental process of Freddy's that monitors adherence to the script.

So the causal theory requires a counterfactual-free account of causal roles. The only option I see is an Aristotelian one. Hence the prospects for a causal theory of mind that uses only the ingredients of post-Aristotelian science are slim.