Edit on 4 Aug. 2013: My current thoughts on this topic: "Is Brain Size Morally Relevant?"
Edit on 23 June 2012: I wrote the following post in May 2009, back when I was confused about consciousness and didn't fully comprehend the reductionist interpretation of it. So forgive some of the language, but most of the content is still sensible and important.
My current position on this topic is as follows. I remain uncertain whether I want to give greater weight to more powerful brains (i.e., ones with greater computational throughput). On the one hand, Bostrom's thought experiment (discussed further below) is compelling. But on the other hand, I have an intuition that what matters most is a single organism's unified experience, and I might bite the bullet and say that if a single mind is divided into two separate minds, then the moral weight has thereby doubled. After all, there's no law of conservation of consciousness analogous to the law of conservation of mass/energy. If the context changes, by converting one unified conscious actor into two, then our assessment can change.
Let me put it another way. It may be that the amount of neural tissue that responds when I prick my finger is more than the amount that responds when a fruit fly is shocked to death. (I don't know if this is true, but I have almost a million times more neurons than a fruit fly.) However, while the pinprick is trivial to me, to the fruit fly, the pain that it endures before death comes is the most overwhelming thing in the world. There's a unified quality to the fruit fly's experience that isn't captured merely by the number of neurons involved, or even by a more sophisticated notion of the computational horsepower of the brain. See also this discussion about Harry the human and Sam the snail.
Another factor that may be relevant is the "clock speed" of the brain in question. If smaller animals tend to have higher clock speeds (at least within a given class of animals, like mammals), then, other things being equal, they would get more weight in our calculations, although it's not clear how big the effect size would be.
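Purely as an illustration of how such a correction might work (this is my own toy assumption, not anything argued for above), suppose moral weight per unit of objective time scaled linearly with both neuron count and relative clock speed; then a small, fast brain would close some of the gap with a large, slow one:

```python
# Toy illustration only: assumes moral weight scales linearly with both neuron
# count and relative "clock speed" (subjective experience-moments per second).
def toy_moral_weight(neurons: float, clock_speed: float, reference_clock: float = 1.0) -> float:
    return neurons * (clock_speed / reference_clock)

# A hypothetical small brain running 10x faster recovers a factor of 10
# relative to a brain 1000x its size running at the reference speed.
print(toy_moral_weight(1e5, 10.0))  # 1000000.0
print(toy_moral_weight(1e8, 1.0))   # 100000000.0
```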
In addition, if anthropic reasoning does not weight observers by brain size, then declining to give moral weight to brain size begins to seem more plausible as well.
There are many more subtleties and arguments bearing on this tough question, but I shall leave those to the discussion that follows. So without further ado, here's the original post.
Original piece from May 2009:
--------------------------
Several people have suggested to me the idea that capacity for suffering may vary in rough proportion with -- or at least according to some approximately monotonic function of -- brain size. (I'll use "brain size" as a shorthand term referring to the amount of neural tissue an organism has. Perhaps a more relevant measure, though one for which it's harder to find good statistics, is the amount of neural tissue devoted specifically to producing pain emotions, rather than, say, vision processing or planning.) When I first heard this idea, I found it somewhat surprising. Introspectively, consciousness feels like a binary state: I'm aware and feeling emotions now, but I don't do either to any degree when I'm under general anesthesia. I can't recall feeling "more intensely conscious" at one time rather than another, unless you count the groggy state right before falling asleep. On the other hand, personal introspection doesn't prove much, because during my adult life, I've always had the same brain size, so that even if sentience did vary with brain size, I wouldn't know it. (I might be able to notice the difference if my child self had experienced less intense emotions than my adult self. Introspectively this doesn't feel true, but I don't remember my feelings as a very young child well enough to be sure.)
There do seem to be some good arguments for the size-proportionality position.
- As the cortical homunculus notion suggests, regions of the body that are more sensitive (e.g., the hands) have far more neurons devoted to them in the brain. My feelings of touch are much more refined (and somewhat more intense) on my hands than on my back. Presumably the same would be true of pain nerves.
- Painkillers work by reducing the number of pain signals that are produced or the number that actually reach the spinal cord and brain. A smaller number of signals translates into less intense experience.
- We suspect that other abilities of the brain also vary with its size, notably intelligence. That ability is apparently not binary, despite the fact that it feels somewhat binary to any one individual. (I do have a sense of my intelligence varying from moment to moment, but not drastically.)
However, there are some objections.
- If sentience scales with brain size, then would men feel emotions more intensely than women, since their brains are larger? And should we worry less about pain in children than in adults? Maybe -- politically unpopular ideas needn't be incorrect.
- Intelligence does not seem to be related to absolute brain size but more to relative brain size. Men and women seem to have basically the same IQs even though men have an extra 4 billion brain cells. Moreover, brain size scales very closely with body size (especially lean body mass), so that in fact, whales have much bigger brains (4000-7000 g) than humans (1300-1700 g). Indeed, the standard metrics of interest in intelligence studies are not absolute brain size but relative measures like brain-to-body mass ratio or the encephalization quotient (EQ), which compares actual brain mass against the brain mass predicted for an animal of that body size. Shouldn't we expect a similar sort of trend for emotional intensity?
- In his "Toward Welfare Biology," Yew-Kwang Ng proposes the principle that because hedonic experiences are energetically costly, evolution generally endows organisms with the minimal levels of emotion needed to motivate them to action, perhaps to differential degrees depending on the situation. Ng is an economist rather than a biologist, so I'm not sure how realistic this assumption is, but it does seem true that intense emotion is more metabolically taxing -- thus explaining why we're close to hedonic neutrality most of the time. The question for the sentience-scaling view is then why evolution would give larger-brained organisms substantially more intense emotions in order to motivate them to do similar sorts of things (eat, mate, avoid predators). Maybe the reply would be that, because of their greater prevalence of neurons, larger organisms simply experience greater emotional intensity for the same level of metabolic cost? Or maybe larger-brained organisms, being perhaps more intelligent, have a larger number of options available to them and so need a larger number of levels of pleasure and pain for motivating a wider range of possible responses?
- If consciousness results from implementing the right algorithm, then maybe it doesn't matter exactly how that algorithm is run? This suggests the notion that consciousness is either on or off, at least for serial computers. As an illustration (see the code sketch after this list), there are lots of functions for computing factorial(n) -- some fast, some slow, some simple, some complex -- but whether a given function computes factorial(n) is either true or false. It doesn't depend on lines of code or the computational burden of running the code. (On the other hand, the number of instances of factorial(n) that can be run on a given machine does depend on the latter factor.)
- Certain qualia, such as the redness of red, do seem to be binary -- indeed, that's the whole premise behind David Chalmers's "fading qualia" thought experiment. Might consciousness itself be the same way? This is especially interesting viewed in light of the suggestion that consciousness should be substrate-independent. I'm not sure if qualia can be produced by Turing-machine simulations, but if so, what does "brain size" look like in this context? Of course, we do know that different degrees of pain and pleasure are possible, so this last point may be a red herring.
- On the painkiller point, it may be that fewer pain signals translate into less pain for a given organism. But maybe what matters is the relative amount of pain signals vs. other signals the brain receives. Larger brains have to process more total inputs. Maybe a tiny brain receiving only a few pain signals feels subjectively worse than a large brain that, while receiving a larger number of pain signals, is distracted by lots of other information. (The gate control theory of pain may be relevant here?)
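Here is a minimal Python sketch of the factorial point from the algorithm bullet above (purely illustrative): two implementations that differ in structure and running cost, yet "computes factorial(n)" is a binary property that simply holds for both.

```python
# Two implementations of factorial(n): different code size and runtime
# characteristics, but "computes factorial" is simply true of both.

def factorial_recursive(n: int) -> int:
    """Short recursive definition."""
    return 1 if n <= 1 else n * factorial_recursive(n - 1)

def factorial_iterative(n: int) -> int:
    """Longer iterative version: more lines, no recursion, same function computed."""
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

# Whether each function computes factorial doesn't depend on how it's written:
assert all(factorial_recursive(n) == factorial_iterative(n) for n in range(10))
```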
The question of whether and how emotional intensity scales with brain size is, in my view, extremely important, because it affects the urgency we attach to potential suffering by insects in the wild, assuming they can feel pain at all. If, for instance, insect suffering is 1000 times less intense than human suffering, then we should discount the 10^18 insects in the world not only by their probability of sentience (say, 0.1, to use a round number) but also by their reduced emotional intensity if sentient (0.001). In that case, there would be effectively 10^14 human-equivalent insects, much closer to the 10^10 humans that exist.
There's a fundamental problem here, though. A good Bayesian will not pick a point estimate for the ratio of insect sentience to human sentience but will maintain a probability distribution over plausible ratios. In view of the uncertainty on this question,
I think it's reasonable to maintain some probability on the hypothesis that insects do suffer about as much as humans. For instance, maybe we assign probability 0.2 to insects being able to suffer at least half as much. But in that case, an upper bound on the "expected ratio" of sentience between insects and humans is 0.1, which implies much more concern for insects than a ratio like 0.001, even if the latter is the value we consider most likely.
Note that if number of neurons devoted to pain processing is the relevant measure, then the disparity between insects and humans will probably be smaller than straight division based on brain mass would suggest, since I assume humans devote a larger fraction of neurons to non-hedonic brain functions. On the other hand, humans are endothermic and, so a friend tells me, have a larger proportion of synapses than insects. Exactly how to incorporate these factors into a weighting even if we do apply one is not obvious.
Edit on 4 Aug. 2013:
My current thoughts on this topic: "Is Brain Size Morally Relevant?"
Edit on 23 June 2012:
I wrote the following post in May 2009, back when I was confused about consciousness and didn't fully comprehend the reductionist interpretation of it. So forgive some of the language, but most of the content is still sensible and important.
My current position on this topic is as follows. I remain uncertain whether I want to give greater weight to more powerful brains (i.e., ones with greater computational throughput). On the one hand, Bostrom's thought experiment (discussed further below) is compelling. But on the other hand, I have an intuition that what matters most is a single organism's unified experience, and I might bite the bullet and say that if a single mind is divided into two separate minds, then the moral weight has thereby doubled. After all, there's no law of conservation of consciousness analogous to the law of conservation of mass/energy. If the context changes -- by converting one unified conscious actor into two -- then our assessment can change.
Let me put it another way. It may be that the amount of neural tissue that responds when I prick my finger is more than the amount that responds when a fruit fly is shocked to death. (I don't know if this is true, but I have almost a million times more neurons than a fruit fly.) However, while the pinprick is trivial to me, the pain the fruit fly endures before death is the most overwhelming thing in the world to it. There's a unified quality to the fruit fly's experience that isn't captured merely by number of neurons, or even by a more sophisticated notion of the computational horsepower of the brain. See also this discussion about Harry the human and Sam the snail.
Another factor that may be relevant is the "clock speed" of the brain in question. If smaller animals tend to have higher clock speeds (at least within a given class of animals, like mammals), then other things being equal, they would get more weight in our calculations, although it's not clear how big the effect size would be.
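As a toy sketch of how such an adjustment might enter a calculation -- the multiplicative form and every number below are purely illustrative assumptions on my part, not empirical claims:
    # Toy sketch: scaling a base moral weight by relative "clock speed."
    # The multiplicative form and all numbers are assumptions for illustration.
    def clock_adjusted_weight(base_weight, animal_hz, human_hz=1.0):
        """Scale a base weight by the animal's subjective speed relative to a human's."""
        return base_weight * (animal_hz / human_hz)

    # E.g., if (hypothetically) a small animal's relevant neural dynamics ran
    # twice as fast as a human's, a base weight of 0.01 would become 0.02:
    print(clock_adjusted_weight(0.01, animal_hz=2.0))  # 0.02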
In addition, if anthropics are not weighted by brain size, then not giving moral weight to brain size begins to seem more plausible as well.
There are many more subtleties and arguments bearing on this tough question, but I shall leave those to the discussion that follows. So without further ado, here's the original post.
Original piece from May 2009:
--------------------------
Several people have suggested to me the idea that capacity for suffering may vary in rough proportion with -- or at least according to some approximately monotonic function of -- brain size. (I'll use "brain size" as a shorthand term referring to the amount of neural tissue an organism has. Perhaps a more relevant measure, though one for which it's harder to find good statistics, is the amount of neural tissue devoted specifically to producing pain emotions, rather than, say, vision processing or planning.) When I first heard this idea, I found it somewhat surprising. Introspectively, consciousness feels like a binary state: I'm aware and feeling emotions now, but I don't do either to any degree when I'm under general anesthesia. I can't recall feeling "more intensely conscious" at one time rather than another, unless you count the groggy state right before falling asleep. On the other hand, personal introspection doesn't prove much, because during my adult life, I've always had the same brain size, so that even if sentience did vary with brain size, I wouldn't know it. (I might be able to notice the difference if my child self had experienced less intense emotions than my adult self. Introspectively this doesn't feel true, but I don't remember my feelings as a very young child well enough to be sure.)
There do seem to be some good arguments for the size-proportionality position.
- As the cortical homunculus notion suggests, regions of the body that are more sensitive (e.g., the hands) have far more neurons devoted to them in the brain. My feelings of touch are much more refined (and somewhat more intense) on my hands than on my back. Presumably the same would be true of pain nerves.
- Painkillers work by reducing the number of pain signals that are produced or the number that actually reach the spinal cord and brain. A smaller number of signals translates into less intense experience.
- We suspect that other abilities of the brain also vary with its size, notably intelligence. That ability is apparently not binary, despite the fact that it feels somewhat binary to any one individual. (I do have a sense of my intelligence varying from moment to moment, but not drastically.)
However, there are some objections.
- If sentience scales with brain size, then would men have more intensity of emotion than women, since their brains are larger? And should we worry less about pain in children than adults? Maybe -- politically unpopular ideas needn't be incorrect.
- Intelligence does not seem to be related to absolute brain size but more to relative brain size. Men and women seem to have basically the same IQs even though men have an extra 4 billion brain cells. Moreover, brain size scales very closely with body size (especially lean body mass), so that in fact, whales have much bigger brains (4000-7000 g) than humans (1300-1700 g). Indeed, the standard metric of interest in intelligence studies is not absolute brain size but a relative measure like the encephalization quotient (EQ), which compares an animal's actual brain mass against the brain mass expected for an animal of its body size. Shouldn't we expect a similar sort of trend for emotional intensity?
- In his "Toward Welfare Biology," Yew-Kwang Ng proposes the principle that because hedonic experiences are energetically costly, evolution generally endows organisms with the minimal levels of emotion needed to motivate them to action, perhaps to differential degrees depending on the situation. Ng is an economist rather than a biologist, so I'm not sure how realistic this assumption is, but it does seem true that intense emotion is more metabolically taxing -- thus explaining why we're close to hedonic neutrality most of the time. The question for the sentience-scaling view is then why evolution would give larger-brained organisms substantially more intense emotions in order to motivate them to do similar sorts of things (eat, mate, avoid predators). Maybe the reply would be that, because of their greater prevalence of neurons, larger organisms simply experience greater emotional intensity for the same level of metabolic cost? Or maybe larger-brained organisms, being perhaps more intelligent, have a larger number of options available to them and so need a larger number of levels of pleasure and pain for motivating a wider range of possible responses?
- If consciousness results from implementing the right algorithm, then maybe it doesn't matter exactly how that algorithm is run? This suggests the notion that consciousness is either on or off, at least for serial computers. As an illustration, there are lots of functions for computing factorial(n) -- some fast, some slow, some simple, some complex -- but whether a given function computes factorial(n) is either true or false. It doesn't depend on lines of code or the computational burden of running the code. (On the other hand, the number of instances of factorial(n) that can be run on a given machine does depend on the latter factor.) A short sketch after this list spells out the analogy.
- Certain qualia, such as the redness of red, do seem to be binary -- indeed, that's the whole premise behind David Chalmers's "fading qualia" thought experiment. Might consciousness itself be the same way? This is especially interesting viewed in light of the suggestion that consciousness should be substrate-independent. I'm not sure if qualia can be produced by Turing-machine simulations, but if so, what does "brain size" look like in this context? Of course, we do know that different degrees of pain and pleasure are possible, so this last point may be a red herring.
- On the painkiller point, it may be that fewer pain signals translate into less pain for a given organism. But maybe what matters are the relative amounts of pain signals vs. other signals the brain receives. Larger brains have to process more total inputs. Maybe a tiny brain receiving only a few pain signals feels subjectively worse than a large brain that, while receiving a larger number of pain signals, is distracted by lots of other information. (The gate control theory of pain may be relevant here?)
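To make the factorial analogy concrete, here's a toy sketch of two implementations that differ in style and running cost yet compute exactly the same function -- whether a function "computes factorial" is a yes/no property, even though its resource usage isn't:
    # Two ways to compute factorial(n). They differ in how the computation is
    # run, but "does this function compute factorial?" is binary for both.
    def factorial_recursive(n):
        return 1 if n <= 1 else n * factorial_recursive(n - 1)

    def factorial_iterative(n):
        result = 1
        for k in range(2, n + 1):
            result *= k
        return result

    # Same outputs, different implementations:
    assert all(factorial_recursive(n) == factorial_iterative(n) for n in range(10))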
The question of whether and how emotional intensity scales with brain size is, in my view, extremely important, because it affects the urgency we attach to potential suffering by insects in the wild, assuming they can feel pain at all. If, for instance, insect suffering is 1000 times less intense than human suffering, then we should discount the 10^18 insects in the world not only by their probability of sentience (say, 0.1, to use a round number) but also by their reduced emotional intensity if sentient (0.001). In that case, there would be effectively 10^14 human-equivalent insects, much closer to the 10^10 humans that exist.
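As a quick check on that arithmetic, using only the round numbers above:
    # Point-estimate discounting of insects, using the round numbers from the text.
    num_insects = 10**18
    p_sentient = 0.1            # assumed probability that insects are sentient
    relative_intensity = 0.001  # assumed intensity of insect vs. human suffering
    human_equivalents = num_insects * p_sentient * relative_intensity
    print(human_equivalents)    # 1e+14, versus roughly 1e10 humans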
There's a fundamental problem here, though. A good Bayesian will not pick a point estimate for the ratio of insect sentience to human sentience but will maintain a probability distribution over plausible ratios. In view of the uncertainty on this question, I think it's reasonable to maintain some probability on the hypothesis that insects do suffer about as much as humans. For instance, maybe we assign probability 0.2 to insects being able to suffer at least half as much. But in that case, a lower bound on the "expected ratio" of sentience between insects and humans is 0.2 * 0.5 = 0.1, which implies much more concern for insects than a ratio like 0.001, even if the latter is the value we consider most likely.
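For concreteness, here's a toy sketch of that expected-value point; only the 0.2 probability of suffering at least half as much comes from the reasoning above, and the rest of the three-point distribution is a made-up placeholder:
    # A hypothetical distribution over the insect-to-human suffering ratio.
    # Only the (0.5, 0.2) entry reflects the text; 0.5 stands in conservatively
    # for "at least half as much." The other entries are placeholders.
    distribution = [
        (0.5, 0.2),    # suffers at least half as much, probability 0.2
        (0.001, 0.7),  # the "most likely" ratio, probability 0.7
        (0.0, 0.1),    # little or no capacity, probability 0.1
    ]
    expected_ratio = sum(ratio * prob for ratio, prob in distribution)
    print(expected_ratio)  # ~0.1007 -- dominated by the small chance of near-parity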
Note that if number of neurons devoted to pain processing is the relevant measure, then the disparity between insects and humans will probably be smaller than straight division based on brain mass would suggest, since I assume humans devote a larger fraction of neurons to non-hedonic brain functions. On the other hand, humans are endothermic and, so a friend tells me, have a larger proportion of synapses than insects. Exactly how to incorporate these factors into a weighting, even if we do apply one, is not obvious.
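To illustrate how such a correction could work -- the neuron counts are rough published figures, but the hedonic fractions are numbers I'm inventing purely for illustration:
    # Adjusting a raw neuron-count comparison by the (hypothetical) fraction of
    # neurons devoted to pain/pleasure processing in each organism.
    human_neurons = 86e9             # approximate total neurons in a human brain
    insect_neurons = 1e5             # rough order of magnitude for a fruit fly
    human_hedonic_fraction = 0.01    # invented: most human neurons do other work
    insect_hedonic_fraction = 0.10   # invented: a larger share in a tiny brain
    raw_ratio = insect_neurons / human_neurons
    corrected_ratio = (insect_neurons * insect_hedonic_fraction) / (human_neurons * human_hedonic_fraction)
    print(raw_ratio)        # ~1.2e-06
    print(corrected_ratio)  # ~1.2e-05 -- a tenfold smaller disparity than raw counts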