If we want to know whether a certain policy or legal change would be a good idea, we should presumably consider the expected consequences. Very roughly speaking (abstracting away from uncertainty and hence the need to weight various possible options by their probability), we should carefully assess what would happen if the policy is / isn't instituted, and we can then assess which result is morally better. (I don't mean to assume consequentialism here -- if a policy violates rights, for example, one might deem it morally "worse" on those grounds.) Call this approach "thinking realistically about policy outcomes." (Also known as "thinking like an economist.")
It's surprisingly rare (excepting economists, of course). Most people seem to fixate on the most salient aspect of a proposed policy and form a positive or negative impression based on that alone. E.g., (1) Americans love the mortgage tax deduction -- it helps you to afford a house! Never mind what a regressive policy this is, or how it distorts the housing market. (2) The Farm Bill's agricultural subsidies help hard-working farmers! Again, never mind that they're opposed by basically every economist from across the political spectrum as wasteful, distortionary, and anti-competitive (harming the global poor). (3) A revenue-neutral gasoline tax would cost me at the pump! Never mind that it would help disincentivize harmful pollution, congestion, etc., and reward those who use less.
But this lack of realistic thinking doesn't just affect the unreflective "folk". Academics too -- perhaps especially the more left-leaning ones -- commonly seem to echo this mistake, albeit in more sophisticated ways. A few examples spring to mind: (1) Bioethicists regularly neglect the "unseen harms" of over-regulation. (2) My previous post on sex selection highlighted the fallacy of moving from moral qualms about people's having certain preferences to the policy conclusion that acting on such preferences should be banned. The first comment responding to my post then repeated the very same mistake! (3) A Facebook friend suggested that kidney markets are bad because they create incentives for the rich to keep people poor (on the off chance that they one day need an organ, and so could get it for slightly cheaper, I guess?). This was suggested by a very intelligent philosopher (and 'liked' by another). But is there any realistic story to tell of how this "incentive" could be anything other than completely negligible in magnitude? I worry that a certain style of ideological thought is substituting for carefully considered empirical predictions about the particular situation at hand.
These various examples are all quite different from one another. But they all serve to illustrate how easily we can be led to (possibly misguided) policy conclusions by methods other than thinking carefully about the likely outcomes of policies. I find the ubiquity of this tendency both surprising and worrying. Anyway, it seems useful to identify the general phenomenon, to more easily guard against it in future.
One should be careful about taking people's stated positions to reflect what policy they would really choose if it were actually up to them. As I understand it, various studies have shown that giving people one vote out of a vast number often yields different choices than giving them dictatorial control over the outcome. This makes a great deal of sense when you consider that the primary incentive for taking positions on policy matters is to affiliate yourself with one group or another and advertise the kind of person you are.
I suggest that one thing all your examples have in common is that doing the right thing is directly and immediately distasteful, while the wrong thing only produces its distasteful effects indirectly. Refusing to do anything to stop biotech abuses or immoral choices about babies, or endorsing the monetization of life/organs, feels bad -- people don't like to think of themselves that way. On the other hand, you can simply choose not to dwell on the indirect effects, or assure yourself that surely we can avoid those effects some other way, and so avoid the mental cost.