On “Beneficial Lies”

I don’t believe what I’m about to say in the following thought experiment, but suppose that news of bin Laden’s death restores the economy to as good a state as it’s ever been in. Then suppose that his “death” was actually a lie concocted by economists and politicians because they knew–with reasonable certainty based on seemingly airtight calculations–that this lie would spur economic growth. Now, if there were ever an instance where desirism’s broad “people generally” statement could be made confidently, this is it. Certainly, “people generally” have reason to promote that which spurs economic growth, right?

If morality can reduce to desires fulfilled and thwarted, then should we praise or condemn the “beneficial lie?”

One might respond by pointing to a higher-order prohibition on lying in general. Yes, “people generally” also have reason to promote honesty–I do not deny that–but why? Why do people have reason to promote honesty under desirism? Is it not in order to procure desire fulfillment? If I understand it right, there are no categorical imperatives in desirism, that is, no absolute, unconditional requirements. “Dishonesty is always wrong” seems to be an example of a categorical imperative, so it seems fair to say that desirists ought to reject “Dishonesty is always wrong” in favor of something more like, “Dishonesty that thwarts more than fulfills desires is always wrong.” Remember, the economists and politicians knew with reasonable certainty, based on seemingly airtight calculations, that this lie would spur economic growth. So then, if they fulfill many desires and thwart few or none, should we praise or condemn the “beneficial lie”?

36 Comments

  1. woodchuck64 says:

    Remember, the economists and politicians knew with reasonable certainty based on seemingly airtight calculations that this lie would spur economic growth. So then, if they fulfill many desires and thwart few or none, should we praise or condemn the “beneficial lie?”

    The main problem I have is that I can’t see that such a beneficial lie could reasonably be expected to fulfill many desires and thwart few. Sure, it might spur economic growth, but it would do so at the risk of severely undermining confidence in the future pronouncements of economists and politicians, which would have to be desire-thwarting for the society as a whole.

    Put another way, it is generally harder to tell a lie that will be believed over long periods of time than it is to tell the corresponding truth. It seems unlikely that manipulating people with lies will generally fulfill their desires more so than offering them truth and letting them use it for themselves. Being exposed as a liar is generally desire-thwarting for you and the people who formerly trusted you. Therefore, telling beneficial lies is probably not worth it as a matter of habit or policy.

  2. Garren says:

    I don’t think Desirism is going to be a clearly articulated stance until this ‘people generally’ phrase is dropped.

  3. lackofcheese says:

    A more commonly used term is the “noble lie” (http://en.wikipedia.org/wiki/Noble_lie).

    I’m not a desirist, but you could still reject the noble lie directly on desirist reasoning by positing that people have a desire to know the truth, which would be thwarted by such a lie.

    I do not agree with a categorical rejection of noble lies (that would be silly), but in the real world they tend to fall flat. Firstly, the noble lie offers a false dilemma, where it is assumed that such benefits could not be brought about without lying. Secondly, if the truth comes out, which is very likely in the end, greater harm could come about.

    Personally, I value knowing the truth quite highly, both for its own sake and because an understanding of reality is essential to everything else I value. While I won’t reject the ‘noble lie’ on principle, I haven’t really seen a realistic example of one.

  4. woodchuck64 says:

    Garren,

    I don’t think Desirism is going to be a clearly articulated stance until this ‘people generally’ phrase is dropped.

    Why’s that? It seems to me you have to use ‘generally’ because a body of moral rules that applies specifically to every situation and every unique mix of individual desires would just be too large to teach to kids. Desirism, as I understand it, has to be practically useful and work within our biological mechanisms for experiencing morality; too many rules and exceptions would overwhelm it.

  5. Garren says:

    @woodchuck64,

    Because ‘the current majority has reason to’ doesn’t have the same ring as ‘people generally have reason to.’ I’m suspicious of verbal formulas.

  6. tmp says:

    cl,

    “Why do people have reason to promote honesty under desirism?”

    Because it is useful for OTHER PEOPLE to be truthful. Promoting honesty is not the same thing as being truthful yourself; if you are not likely to get caught, and have no personal objection to lying, then of course you should lie if it is useful to you.

    “desirists ought to reject ‘Dishonesty is always wrong'”

    Actually no. A desirist either believes that dishonesty is always wrong, or not. I don’t see any place in desirism that says that something is “right” or “wrong”, because the desirist definition of morality does not include these concepts.

    “thwarts more than fulfills desires”

    This is old desirism; I believe that Fyfe commented on commonsenseatheism that it did not work (it gives desires intrinsic value) and has been dropped.

    “should we praise or condemn the ‘beneficial lie’?”

    This depends entirely on how you feel about lying, and on how beneficial the ‘beneficial lie’ will ultimately be when counting all indirect effects.

    The desirist ‘people in general’ seems to have one purpose only: if an agent desires to give moral prescription, then giving it to ‘people in general’ gives maximum audience. You could give advice to ‘that guy who intends to shoot his classmates dead’ and it would not be intrinsically different from giving advice to ‘people in general’, except for a much smaller audience and the fact that you may have a personal aversion to enabling mass murder.

  7. Crude says:

    Personally, I value knowing the truth quite highly, both for its own sake and because an understanding of reality is essential to everything else I value.

    What’s the difference between valuing knowing the truth, and valuing the belief that you know the truth? Because really, it seems to me that insofar as desire and knowledge go, the truth means next to nothing. It’s that belief which does the heavy lifting.

    When a hypothetical case like cl’s is put forward, and it’s pointed out that it’s possible the lie will be exposed, the replier is actually making two unspoken suggestions. 1, “don’t lie.” 2, “don’t get caught lying”.

    Really, apply it to a more mundane situation: “People value skyscrapers.” I reply, “Ah, but a skyscraper may collapse and kill thousands.” Well, did I just shoot down the plan to build skyscrapers on desirism? Or is the natural reply “Yeah, let’s do our best to make sure it doesn’t collapse”? Now, you can kick this over to a likelihood issue – but that’s another way of saying “lie if it’ll result in gains and if you can minimize the risk of exposure adequately”.

    Anyway, desirism to me seems like such obvious crap. Considering that desires not only can change, but can do so from third party intervention (whether subtle or not, emphasis on the ‘not’ given the sort of guys I always see advancing this idea), there’s little interesting about the idea. Though it’s funny to imagine a situation where desirists find the majority of people desire to ship all desirists to Siberia and be done with them.

    And the best part is, so long as either a third party or the desirists themselves manage to convince the desirists to desire freezing weather and vodka, it’d be a win-win.

  8. Rufus says:

    Bernard Williams’ A Critique of Utilitarianism highlights precisely what I think is wrong with any form of utilitarianism.

    Crude is right on the money with the “convincing” aspects of utilitarianism. It is the powerful who set the agenda for the rest of us. Remember that for the desirist there is no intrinsic value to any particular desires or actions. Most desires are malleable. So, I would define as powerful any person or group of people who can manipulate a person’s desires such that they desire something other than what they would without that manipulation.

    Since no desire or action is intrinsically good or evil, we might imagine something absolutely repulsive like the rape of a small child. The desirist says this is bad only insofar as raping small children tends to thwart more desires than it promotes. The canned desirist response is that we need to “dial down” the rapist’s desire to rape while dialing up the desire to protect children and the desire, generally, to have an aversion to rape.

    However, if our brains could be manipulated via chemical, electrical, psychological means, or whatever, it is not implausible that the dials could be switched. We could turn the dial down so that parents and child rape victims tend to have little to no aversion to rape while we turn the dial up within society for child rapists to go about their “business” happily. Again, since there is nothing intrinsically wrong with raping a child, we are left merely considering whether or not the dials are set correctly.

    The desirist might note that certain desires are less malleable than others. In fact, I think Fyfe argues that some are fixed and cannot be changed. Really? Does he have evidence for this? What material thing cannot be changed? I submit that this is, at best, a technical problem. In the future, it might be possible to do whatever one likes while victims are happily drugged out or uploaded into a Matrix-like delusion.

    In other words, as we become better at manipulating neurological states, Harris and Fyfe-like ethical arguments become more and more irrelevant. Why shouldn’t a powerful group of people control the “moral-landscape” of the mass of people so that they will not protest when they are physically abused? Because it’s a violation of their freedom? So what! I thought we were concerned with brain-states here!

    Best,

    Rufus

  9. dguller says:

    Rufus:

    >> However, if our brains could be manipulated via chemical, electrical, psychological means, or whatever, it is not implausible that the dials could be switched. We could turn the dial down so that parents and child rape victims tend to have little to no aversion to rape while we turn the dial up within society for child rapists to go about their “business” happily. Again, since there is nothing intrinsically wrong with raping a child, we are left merely considering whether or not the dials are set correctly.

    That is an interesting thought experiment. Say that our neurobiology and wider culture changed sufficiently that forced sexual intercourse with a child resulted in sheer pleasure in the child, no negative developmental consequences, pride in the parents, and actually contributed to increasing the overall well-being of the culture at large.

    The question would then be whether forced sexual intercourse of a child would be immoral. Well, under those circumstances, I would say that it would be moral, because it contributes to the overall flourishing of the child and community. However, under those circumstances, we are no longer talking about human beings as we know them, but imaginary creatures that resemble us in many respects, but who operate according to different moral principles. After all, moral principles are relative to the character and constitution of the agents applying them, and if you change their character and constitution, then you change the morality.

    Does this imply that our moral intuitions are therefore false? No, it does not. It only implies that morality is not an absolute all-or-nothing phenomenon that is applicable to moral agents no matter what the constitution or psychological profile of the moral agents themselves actually is.

  10. Rufus says:

    dguller,

    My mantra is that philosophy tells us the price we have to pay. If you accept the desirist account of morality, then you must accept that forced intercourse with a child is not just possibly morally permissible, but possibly good. Sure, it’s consistent, but completely untoward in my book. I am not willing to pay that price. Perhaps it is because of my moral intuitions, but I think a theory that completely disregards moral intuitions and can, in theory, allow for technological advancements whereby every last one of my moral intuitions is proved false is likely to be a bad theory.

    If I were to travel to some future or parallel world where, due to technological innovation, my moral intuitions were completely inverted by mastery of brain states, that is a moral universe of which I would want no part. Desirism offers no objection to such a world being brought about.

    Thanks,

    Rufus

  11. dguller says:

    Rufus:

    >> My mantra is that philosophy tells us the price we have to pay. If you accept the desirist account of morality, then you must accept that forced intercourse with a child is not just possibly morally permissible, but possibly good. Sure, it’s consistent, but completely untoward in my book. I am not willing to pay that price. Perhaps it is because of my moral intuitions, but I think a theory that completely disregards moral intuitions and can, in theory, allow for technological advancements whereby every last one of my moral intuitions is proved false is likely to be a bad theory.

    That is a fair point, but I wonder if you overstate your case a bit.

    Say you came across an alien race that raped their children, but this actually resulted in many positive benefits for the children and no deleterious developmental consequences, and also benefited the community and parents. Furthermore, say that not raping their children would result in developmental delay of the children, and the subsequent instability of their community. Would you harangue them about their immorality, or would you just accept that they are constitutionally different from us, and that therefore our morality would not apply to them, and vice versa?

    >> If I were to travel to some future or parallel world where, due to technological innovation, my moral intuitions were completely inverted by mastery of brain states, that is a moral universe of which I would want no part. Desirism offers no objection to such a world being brought about.

    That is true, but there is a difference between our moral intuitions and the facts about our psychological well-being. I think that your objections are rooted in the idea that our moral intuitions can be radically altered, but without any additional revision in what results in our individual and collective flourishing. So, my moral intuition that raping a child is immoral can be twisted into seeing child rape as a positive and moral act, and I can proceed to rape children, but this would result in significant mental and emotional damage to the children. In that case, I would agree that this is a true perversion, but that is only because my intuitions do not align with facts about human nature.

    However, if human nature itself changed into something more along the lines of my alien race example above, then our intuitions would be in line with our nature, and thus well-being and flourishing would occur, even though they contradict our current intuitions. Of course, we would no longer be what we understand as “human”. So, it all depends upon the nature of the moral agents themselves, and not just on our intuitions.

  12. woodchuck64 says:

    Rufus,

    If you accept the desirist account of morality, then you must accept that forced intercourse with a child is not just possibly morally permissible, but possibly good.

    The sort of drastic neurological changes that would permit this would also make it impossible to use the word “child” as you do above. There would be very little in common between such creatures and human children.

    Sure, it’s consistent, but completely untoward in my book. I am not willing to pay that price. Perhaps it is because of my moral intuitions, but I think a theory that completely disregards moral intuitions and can, in theory, allow for technological advancements whereby every last one of my moral intuitions is proved false, is likely to be a bad theory.

    If moral intuitions evolved to benefit social cohesion, moral intuitions can be expected to change as we change. The preponderance of evidence seems to solidly support an evolutionary origin for moral behavior. It seems unlikely, though, that your moral intuitions will all be completely proven false as it seems unlikely that humans would ever abandon value for social behavior completely.

  13. Rufus says:

    dguller,

    In your alien race counter-analogy, I wonder if there is a significant difference in emphasis. I have been focusing on the idea that a few in power could use technological means to manipulate what constitutes flourishing for the rest of us to better fit with their natural non-manipulated agenda.

    So if I were to travel to this world and discover that this is simply how these aliens would interact with one another given their nature, then maybe I could not mount this objection. But if I were to find out that these aliens had all been drugged by their leader and that naturally without this drug they would behave in much the way I would, i.e. not raping little children, I would think something terrible is going on.

    But let me return to the naturally raping aliens. If I had the power to manipulate their brain-states and completely change their society such that raping little alien children no longer constituted flourishing, why not do it? I would be making my desires conform with what tends to promote other desires. Now I am the terrible leader, right?

    The point is that desirism allows for morality to be dictated and controlled by the powerful. The weak are merely pawns who must be controlled in order for the desired paradigm to be put in place. At our current technological level, it is easier to control and suppress rape than to give everyone the desire to be raped. So rapists don’t have the ability to assert power to set the paradigm. In the future, this may not be the case. Once technology has advanced to the point where we have complete control over brain-states, morality will be equivalent to the arbitrary desires of the powerful. The Euthyphro dilemma rears its ugly head once again.

    To me, this boils down to the old debate between Socrates and Thrasymachus.

    Best,

    Rufus

  14. cl says:

    Garren,

    I don’t think Desirism is going to be a clearly articulated stance until this ‘people generally’ phrase is dropped.

    I’ve agreed with that for the past two years. Alonzo uses “people generally” as a handy euphemism for, “me and other people who value what I value.”

    Crude,

    And the best part is, so long as either a third party or the desirists themselves manage to convince the desirists to desire freezing weather and vodka, it’d be a win-win.

    LOL! No, Crude, because “people generally” have reasons to promote an aversion to freezing weather and vodka! ;)

    lackofcheese,

    I’m not a desirist, but you could still reject the noble lie directly on desirist reasoning by positing that people have a desire to know the truth, which would be thwarted by such a lie.

    In this case, their desire to know the truth would NOT be thwarted. It would be “fulfilled” in the “subjective desire satisfaction” sense [cf. Episode 14].

    I do not agree with a categorical rejection of noble lies (to do otherwise would be silly), but in the real world they tend to fall flat. Firstly, the noble lie offers a false dilemma, where it is assumed that such benefits could not be brought about without lying. Secondly, if the truth comes out, which is very likely in the end, greater harm could come about.

    Do you have any evidence for those claims? Or, are you just going by gut feeling there?

    Rufus,

    If you accept the desirist account of morality, then you must accept that forced intercourse with a child is not just possibly morally permissible, but possibly good.

    I agree. I’ve tried to make this point on CSA, several times, to no avail. If there are no categorical imperatives, and morality reduces to whether or not desires are fulfilled or thwarted, then any desire can be good, bad, or permissible. I’m surprised–maybe even a bit fearful–that so many people apparently can’t see this flaw.

    The desirist might note that certain desires are less malleable than others. In fact, I think Fyfe argues that some are fixed and cannot be changed. Really? Does he have evidence for this?

    No, and he’s been asked for it on several occasions across multiple blogs by myself, Cyril, and others. He just keeps asserting the same claim, with no evidence. As far as humans are concerned, I’m not convinced that non-malleable desires exist, at all.

    woodchuck64,

    The main problem I have is that I can’t see that such a beneficial lie could reasonably be expected to fulfill many desires and thwart few. Sure, it might spur economic growth but it would do so at the risk of severely undermining confidence in the future pronouncements of economists, politicians, which would have to be desire-thwarting to the society as a whole.

    So? Desirism doesn’t have any clauses about risk, and I’m assuming that this lie would never be revealed.

    It seems unlikely that manipulating people with lies will generally fulfill their desires more so than offering them truth and letting them use it for themselves. Being exposed as a liar is generally desire-thwarting for you and the people who formerly trusted you. Therefore, telling beneficial lies is probably not worth it as a matter of habit or policy.

    Those are your own moral intuitions speaking, though I tend to agree. Desirists are supposed to cast aside their own moral intuitions. Whether or not manipulating people with lies fulfills their desires will vary on a case-by-case basis.

    The sort of drastic neurological changes that would permit this would also make it impossible to use the word “child” as you do above. There would be very little in common with such creatures and human children. [to Rufus]

    Perhaps, but the example illustrates an important point: what humans call “morality” apparently cannot reduce to “that which tends to fulfill more than thwart desires,” else we wouldn’t bat an eyelash at the possibility of “bad” things being “good” just because they fulfill more than thwart desires. Fyfe’s central tenet is circular as can be.

  15. cl says:

    tmp,

    Sorry, the spam filter swallowed your comment. I think because your email address contains the string, “spam.”

    Actually no. A desirist either believes that dishonesty is always wrong, or not.

    That’s wrong, unless…

    This is old Desirism; I believe that Fyfe commented on commonsenseatheism that it did not work(it gives desires intrinsic value) and has been dropped.

    Citation? The desirism Wiki still upholds this concept, although, Fyfe has a history of neglect when it comes to squaring up existing versions of the theory.

  16. woodchuck64 says:

    cl,

    So? Desirism doesn’t have any clauses about risk, and I’m assuming that this lie would never be revealed.

    Risk is a statement of probabilities and the probability of desire thwarting should be very much a part of every desirist calculation.

    It seems unlikely that manipulating people with lies will generally fulfill their desires more so than offering them truth and letting them use it for themselves. Being exposed as a liar is generally desire-thwarting for you and the people who formerly trusted you. Therefore, telling beneficial lies is probably not worth it as a matter of habit or policy.

    Those are your own moral intuitions speaking, though I tend to agree. Desirists are supposed to cast aside their own moral intuitions. Whether or not manipulating people with lies fulfills their desires will vary on a case-by-case basis.

    No, I’m not using my moral intuition in the statement above (or at least trying not to). I am using a kind of social intuition for what sort of desire thwarting/fulfilling could reasonably be the result of certain actions, and that is of course a fallible approach and far from any sort of rigorous proof. However, as long as my social intuition doesn’t conflict strongly with yours on any of those points, we could be pretty close to agreement.

    [to Rufus]

    Perhaps, but the example illustrates an important point: what humans call “morality” apparently cannot reduce to “that which tends to fulfill more than thwart desires,” else we wouldn’t bat an eyelash at the possibility of “bad” things being “good” just because they fulfill more than thwart desires. Fyfe’s central tenet is circular as can be.

    What I’ve seen so far is that hypothetical situations that are many steps removed from human society are used to demonstrate a possibility of “bad” things being considered “good”. But if those hypothetical situations were standard for human society, couldn’t morality be vastly different as well?

    Desirism is saying that what we call “morality” seems to generally reduce to “that which tends to fulfill more than thwart desires” in current human social interaction, but I don’t think Fyfe or anyone would claim it is identical. If it was, we wouldn’t need desirism, our intuitions would do just fine. Rather, I see Fyfe’s approach as trying to add a rigorous methodology to moral calculations which will then be used, in a hypothetical desirist society, to both override moral intuition as needed and instill new moral intuition in the next generation.

  17. dguller says:

    Rufus:

    >> But let me return to the naturally raping aliens. If I had the power to manipulate their brain-states and completely change their society such that raping little alien children no longer constituted flourishing, why not do it? I would be making my desires conform with what tends to promote other desires. Now I am the terrible leader, right?

    Why would you interfere with their society in such a way? Why do your moral preferences trump their moral intuitions, especially since their intuitions are more fitted to their natures? If this is valid, then it is equally valid for this alien race to change our natures to be more permissive of child rape, too!

    >> The point is that desirism allows for morality to be dictated and controlled by the powerful. The weak are merely pawns who must be controlled in order for the desired paradigm to be put in place.

    Well, divine command theory should also be rejected, because it is also an example of a morality that is “dictated and controlled by the powerful”, i.e. God.

    >> At our current technological level, it is easier to control and suppress rape, than to give everyone the desire to be raped. So rapists don’t have the ability to assert power to set the paradigm. In the future, this may not be the case. Once technology has advanced to the point where we have complete control over brain-states, morality will be equivalent to the arbitrary desires of the powerful. The Euthyphro dilemma rears its ugly head once again.

    That is true. If technology has achieved a state where human nature itself can be changed, then morality will become arbitrary in a way that it currently is not. Now, the real issue in that case is that if everyone’s human nature has changed such that rape has the same emotional impact as the tender embrace of two mutually consenting adults in our time, then why on earth would anyone object to it? I mean, the negative effects of rape are now completely gone, and to object to rape in that context would be equivalent to objecting to mutually consenting adult love in our society. And that is the point. Morality must be intrinsically related to the nature of the moral agents, and if their nature changes, then the morality that is appropriate to them must change, too.

  18. dguller says:

    cl:

    >> If there are no categorical imperatives, and morality reduces to whether or not desires are fulfilled or thwarted, then any desire can be good, bad, or permissible. I’m surprised–maybe even a bit fearful–that so many people apparently can’t see this flaw.

    But wouldn’t you say that any categorical imperative is relative to a particular group of moral agents? I mean, categorical imperatives that apply to human beings would not necessarily apply to other primates or to alien beings who had a different nature from our own. So, the issue is not the desire per se, but how that desire fits within the wider context of the moral agents’ nature and environment, because that context will determine what desire is good or bad.

  19. cl says:

    woodchuck64,

    …the probability of desire thwarting should be very much a part of every desirist calculation.

    Sure, but we can’t make such probability assessments with any trustworthy degree of reliability, IMO. Can you demonstrate how we can? I mean, who would have thought that the desire to drive automobiles–a desire that “people generally” had strong reason to promote, mind you–would have led to such a mess? Anyways, it’s a moot point. I’m presupposing for my example that there is no possible way of the lie being exposed.

    However, as long as my social intuition doesn’t conflict strongly with yours on any of those points, we could be pretty close to agreement.

    Well yeah, but I don’t care whether we agree or not. I’m seeking a justification from those who advise against the beneficial lie in this case, and so far, I see nothing except, “Well, it might turn out bad in the end,” but we can apply that to seemingly anything.

    What I’ve seen so far is that hypothetical situations that are many steps removed from human society are used to demonstrate a possibility of “bad” things being considered “good”.

    While I agree that certain thought experiments are, my example here is NOWHERE NEAR far removed from human society.

    Desirism is saying that what we call “morality” seems to generally reduce to “that which tends to fulfill more than thwart desires” in current human social interaction,

    Okay then, so is pornography moral? It fulfills the desires of the ladies who get paid fat cash, it fulfills the desires of those who market said ladies, it fulfills the desires of those who indulge in it… so? Is there any hesitation whatsoever in your response? Per desirism, pornography seems to be quite moral. Or, take it in the other direction: pornography thwarts the desires of a great many people who believe it is immoral. So, which is it, and how do we KNOW?

    …I see Fyfe’s approach as trying to add a rigorous methodology to moral calculations which will then be used,

    No offense, I got nothin’ but love for ya, but get real! Fyfe’s never done so much as a lick of math. Fyfe’s never conducted a single evaluation. Hell, I even gave my best attempt at kickstarting some actual evaluations, here and here. All Fyfe does is shoot from the hip. He takes things that he thinks are wrong, then wraps the theory to fit them. That stuff he says about spectator sports and television sitcoms is pure, unadulterated Tipper Goresque morality, and you know it!

    dguller,

    Well, divine command theory should also be rejected, because it is also an example of a morality that is “dictated and controlled by the powerful”, i.e. God. [to Rufus]

    Not necessarily, because “powerful” isn’t the delimiting criterion here. There is nothing possibly wrong with–and therefore no rational reason to reject–morality dictated by an omnibenevolent, omniscient God.

    And that is the point. Morality must be intrinsically related to the nature of the moral agents, and if their nature changes, then the morality that is appropriate to them must change, too. [to Rufus]

    So, are you a sort of moral relativist then? It seems so.

    But wouldn’t you say that any categorical imperative is relative to a particular group of moral agents? I mean, categorical imperatives that apply to human beings would not necessarily apply to other primates or to alien beings who had a different nature from our own. So, the issue is not the desire per se, but how that desire fits within the wider context of the moral agents’ nature and environment, because that context will determine what desire is good or bad.

    As far as whether categorical imperatives also apply to reptiles, I couldn’t tell you. In desirism, the issue is whether or not a given desire tends to fulfill or thwart other desires. If it fulfills more than thwarts, we promote it. If it thwarts more than fulfills, we condemn it. Aside from the fact that a reliable calculation seems impossible, something about that isn’t quite right–although I grant that desires seem inextricable from any productive discussion of morality. I mean, example after example has been given where a desire seems to fulfill more than thwart other desires, yet desirists are always trying to wriggle out of uncomfortable conclusions.

  20. Rufus says:

    dguller,

    Why would you interfere with their society in such a way? Why do your moral preferences trump their moral intuitions, especially since their intuitions are more fitted to their natures? If this is valid, then it is equally valid for this alien race to change our natures to be more permissive of child rape, too!

    Exactly: why, or why not? I might do it because I want to live comfortably amongst the aliens, and if I have the power to manipulate them into adhering to my values, why not? It’s no more immoral than the scenarios I was positing. My whole point is that you cannot suddenly assert that it is objectively wrong to manipulate the brain-states of a person or people within a society.

    Well, divine command theory should also be rejected, because it is also an example of a morality that is “dictated and controlled by the powerful”, i.e. God.

    That was my point in bringing up Euthyphro. Even if there are adequate responses, as cl suggests, let’s take the typical atheist line that DCT fails because it makes morality arbitrary. Well, we can’t say that and then suppose that desirism fares any better. My thought experiment, though fantastic, proves this point.

    That is true. If technology has achieved a state where human nature itself can be changed, then morality will become arbitrary in a way that it currently is not. Now, the real issue in that case is that if everyone’s human nature has changed such that rape has the same emotional impact as the tender embrace of two mutually consenting adults in our time, then why on earth would anyone object to it? I mean, the negative effects of rape are now completely gone, and to object to rape in that context would be the equivalent of objecting to mutually consenting adult love in our society. And that is the point. Morality must be intrinsically related to the nature of the moral agents, and if their nature changes, then the morality that is appropriate to them must change, too.

    It’s not that morality would become arbitrary, it’s that it would be revealed to be arbitrary. That is equivalent, in my mind, to there really being no morality at all.

    These thought experiments may be many steps removed from our current situation, but I am only revealing the arbitrary nature of the “moral” system by simply injecting the system with a little technological power. This is not so implausible, especially for our singularitarian friends over on CSA.

    Best,

    Rufus

  21. cl says:

    Rufus,

    You started a blog! I shall check it out posthaste.

    I don’t know if it’s what you intended, but I added blockquotes to your last comment, and deleted the comment about lacking blockquotes. Sorry if I took liberty you didn’t want me to, I was just trying to help.

    This is not so implausible, especially for our singularitarian friends over on CSA.

    LOL!

  22. woodchuck64 says:

    cl,

    Sure, but we can’t make such probability assessments with any trustworthy degree of reliability, IMO.

    True, but I didn’t take this as the central objection of your post. My response to a claim that desirism doesn’t work because you can’t measure desires adequately would be that considerable technological advances in understanding the mind/brain are absolutely necessary before we could ever dream of putting desirism on a ballot. But I’m confident those advances will occur; we’re not hitting a wall yet in understanding human behavior.

    Well yeah, but I don’t care whether we agree or not. I’m seeking a justification from those who advise against the beneficial lie in this case, and so far, I see nothing except, “Well, it might turn out bad in the end,” but we can apply that to seemingly anything.

    Well I’m not using “bad”, but “it will probably turn out to thwart more desires than fulfill generally”. That of course is the question: under desirism is a beneficial lie good or bad? I tried to work out a sense of the issues going into a conclusion.

    While I agree that certain thought experiments are far removed, my example here is NOWHERE NEAR that far removed from human society.

    Right, child-rape was the example that I thought was far-removed. In your example, in contrast, I was arguing our moral intuition is more likely in sync with desire calculations, at least according to my social (not moral) intuition.

    Okay then, so is pornography moral? It fulfills the desires of the ladies who get paid fat cash, it fulfills the desires of those who market said ladies, it fulfills the desires of those who indulge in it… so? Is there any hesitation whatsoever in your response? Per desirism, pornography seems to be quite moral. Or, take it in the other direction: pornography thwarts the desires of a great many people who believe it is immoral. So, which is it, and how do we KNOW?

    That’s a tough one; I don’t have it worked out, but most legal-but-yet-somehow-immoral activities fall into these borderline categories. Still, I don’t see the task as impossible in theory, given continued advances in technology and a much better understanding of human behavior.

    All Fyfe does is shoot from the hip. He takes things that he thinks are wrong, then wraps the theory to fit them. That stuff he talks about spectator sports and television sitcoms is pure, unadultered Tipper Goresque morality, and you know it!

    I said he is “trying to add a rigorous methodology”, but I didn’t say he was succeeding :) Desirism is very much a work-in-progress.

  23. Rufus says:

    cl,

    Thanks! I am still learning my way around the internets.

    Thanks for checking out the blog… I’m in the midst of finals, so I have not done much with it in the past month, but I have some plans for it in the near future.

    -Dan

  24. cl says:

    woodchuck64,

    Just curious: what’s the difference between social and moral intuition, as you use them?

    …I’m confident those advances will occur…

    Really? You’re confident that one day, we’ll be able to ask, “How many desires will X thwart, vs. how many desires will X fulfill?” — and get a reliable answer? That’s pretty confident! However, since we’re nowhere near that point now, don’t you think Fyfe puts the cart before the horse when he comes out and makes these sweeping judgments about doing away with things, when we haven’t even got anything close to a reliable calculus? It boggles my mind that this doesn’t seem to irk anybody except me and a few others.

    That of course is the question: under desirism is a beneficial lie good or bad? I tried to work out a sense of the issues going into a conclusion.

    I know, but it seems to me you’re doing the same thing Fyfe does: shooting from the hip and preserving your own intuitions, even though the example was laid out in such a way that guarantees desire fulfillment over desire thwarting. Why not just say, “Sure, lying is good if we can get away with it and desires are more fulfilled than thwarted?”

    That’s a tough one; I don’t have it worked out, but most legal-but-yet-somehow-immoral activities fall into these borderline categories.

    Wait a minute! On the one hand, I respect your reluctance to pontificate, but if you don’t have it worked out, how can you say it’s immoral? Are you just implying that most people think it’s immoral? If so, what is YOUR opinion on pornography, and WHY do you hold it?

    …I don’t see the task as impossible in theory, given continued advances in technology and a much better understanding of human behavior.

    As a theist who believes in an all-knowing God, I agree that it’s not impossible in theory, but if I were an atheist, I’d say no amount of technology or knowledge of behavior could give us the information we need. We need omniscience to reliably predict the effect of any given desire on the balance of all desires. How can we obtain such knowledge? Wouldn’t we be forever plagued by an element of unpredictability?

    I said he is “trying to add a rigorous methodology”, but I didn’t say he was succeeding :) Desirism is very much a work-in-progress.

    Fair enough, I agree, but I have to wonder: why isn’t anyone trying my approach? It definitely has flaws, but should it be wholly ignored? I mean, if we’re going to make calculations, something like what I’m suggesting seems necessary, doesn’t it? Are those “hierarchy of desires” posts fatally flawed, or otherwise worthy of dismissal?

  25. lackofcheese says:

    cl,

    Do you have any evidence for those claims? Or, are you just going by gut feeling there?

    Yes, there is substantial real-world evidence of the risk of the so-called ‘Noble Lie’. The propaganda machines of the past century count as such.
    Sure, those Noble Lies weren’t really noble, because those who told them were wrong, but I’d say that many of those who propagated them thought that they were indeed acting for the greater good.
    In the real world people make plenty of mistakes, and if you allow a very small group of people to decide what Noble Lies the general populace hears, mistakes become very much more likely.
    Similarly, in the real world, as we fall deeper into the Information Age, it’s simply not easy to keep a secret.

    As for my point that in general, in the real world, such benefits could be obtained without lying, I’m not claiming that a realistic Noble Lie couldn’t possibly exist, merely that there don’t seem to be many examples. Can you suggest one that would work in the real world?

    Wait a minute! On the one hand, I respect your reluctance to pontificate, but if you don’t have it worked out, how can you say it’s immoral? Are you just implying that most people think it’s immoral? If so, what is YOUR opinion on pornography, and WHY do you hold it?

    I know this question wasn’t directed at me, but I’ll answer it. Pornography is fine, and in fact, ceteris paribus, it is morally good.

    However, it’s clear that there are some negative effects from certain kinds of pornography. In particular, any positive portrayal of rape is harmful, as it inures society to the act.
    A lot of pornography at the moment is rather misogynistic, and often unrealistic, both of which are harmful.
    On the whole, though, I think pornography is a boon to humankind.

  26. lackofcheese says:

    As for the discussion about intuitions, morality has to come down to something along the lines of an intuition at some point. You simply cannot fully discard our human notions and expect to say anything of value at all.

    However, this isn’t a license to trust your own intuition without reservation either. Human intuitions are often mistaken – I think one of the best examples is with regard to incest. Sure, it’s icky, but that alone doesn’t make something immoral.

    In order to do a good job thinking about morality, you have to critically reflect on your own brain – but by necessity you must do so from within your own brain.

  27. cl says:

    lackofcheese,

    Yes, there is substantial real-world evidence of the risk of the so-called ‘Noble Lie’. The propaganda machines of the past century count as such.

    Can you point me to some scientific data? It seems like you’re just shooting from the hip. I don’t really have a problem with that per se, but, if that’s the case, we need to be honest.

    …I’m not claiming that a realistic Noble Lie couldn’t possibly exist, merely that there don’t seem to be many examples. Can you suggest one that would work in the real world?

    Can you give a solid argument explaining why the one I provided wouldn’t work?

    I know this question wasn’t directed at me, but I’ll answer it. Pornography is fine, and in fact, ceteris paribus, it is morally good.

    Do you have any evidence for that claim? Or, is this just your opinion?

    As for the discussion about intuitions, morality has to come down to something along the lines of an intuition at some point. You simply cannot fully discard our human notions and expect to say anything of value at all.

    I agree. Do I have any evidence for my agreement? No, but I think a logically airtight argument can be given, along the lines of “value requires a valuer.” Or, as you put it, we’re forced to think about this from within our own brains, which are heavily influenced by emotion and intuition.

  28. tmp says:

    cl,

    Citation?


    http://commonsenseatheism.com/?p=12270

    Alonzo Fyfe:

    I never did hold that desire fulfillment had intrinsic value. However, there was once a time when I thought that even though desire fulfillment has no intrinsic value it behaves as something that does have intrinsic value. I thought it could play the role very similar to the role that happiness or pleasure or preference satisfaction plays in other utilitarian theories.

    I gave up that idea at about the same time I gave up “desire utilitarianism” for “desirism”. In fact, I sometimes think that inventing the term “desire fulfillment” is a mistake.

    This thread has a lot of me being a retard and using some terms wrong. Also, Alonzo gives some explanation to “people in general” in this thread.

  29. tmp says:

    cl,

    I’m extrapolating here, but the old definition of “good desire” seems to be heavily tied to “desire fulfillment works like intrinsic value”. And the old definition of “morally good act” comes directly from “good desire”.

    Also, desirism currently defines morality as a practice between moral agents, so it would seem logical that “moral good” would be somehow involved with that practice (and “good desire” isn’t, not really).

  30. lackofcheese says:

    cl,

    Can you point me to some scientific data? It seems like you’re just shooting from the hip. I don’t really have a problem with that per se, but, if that’s the case, we need to be honest.

    There is a definite historical precedent – Nazi Germany, Communist Russia, etc. – that it’s quite dangerous to allow governments to deceive their people on a grand scale.

    Can you give a solid argument explaining why the one I provided wouldn’t work?

    Well, for a start, there’s the alternative idea of capturing or killing bin Laden rather than just pretending to do so. Besides, you’ve given no reason to think that bin Laden’s death would have that much of an impact on the economy, nor have I read anything to that effect anywhere else. If you want to show that it’s a realistic scenario, the burden of proof is on you.

    Do you have any evidence for that claim? Or, is this just your opinion?

    Pornography brings pleasure, and, apart from the issues I mentioned previously, no one is harmed.

  31. cl says:

    tmp,

    I don’t think the definition of “moral good” has changed. That citation doesn’t contain any indication that it has.

    lackofcheese,

    I’m not claiming that a realistic Noble Lie couldn’t possibly exist, merely that there don’t seem to be many examples. Can you suggest one that would work in the real world?

    It’s been a while, so this thread might be more on the “dead” side, but I thought of another beneficial lie last night, and wanted to get some feedback on it.

    Say I’m a broke business owner in a high-theft area. I don’t have the money for a surveillance system, so I install fake cameras and signs that say, “You are being recorded,” even though nobody is being recorded.

    Is this a morally acceptable lie?

  32. tmp says:

    cl,

    “I don’t think the definition of ‘moral good’ has changed. That citation doesn’t contain any indication that it has.”

    But isn’t the definition of moral good in desire utilitarianism based pretty much entirely on the premise that desires behave as if they had intrinsic value? And Alonzo dropped this premise. So either desirism allows for the existence of intrinsic value, or the definition has changed. And I’m pretty sure that Alonzo still claims that intrinsic value does not exist. Has everyone missed this, or have I gotten something wrong again?

  33. tmp says:

    cl,

    Also, I had a brief exchange with Alonzo, in which he confirmed that there is no intrinsic reason for an agent to take other people’s desires into account. I think. It’s hard to be sure, sometimes.

    So, if the definition of “moral good” has not changed, then we are left with “moral good” whose definition does not actually imply any kind of “moral ought”. Which is pretty useless, really.

  34. cl says:

    tmp,

    I’m pretty sure that Alonzo still claims that intrinsic value does not exist. Has everyone missed this, or have I gotten something wrong again?

    Neither. Alonzo still claims that intrinsic value does not exist.

    …isn’t the definition of moral good in desire utilitarianism based pretty much entirely on the premise that desires behave as if they had intrinsic value?

    I don’t think it is. A morally good desire is a desire that tends to fulfill more than thwart the desires of others. An irrational desire is a desire that tends to thwart more than fulfill the desires of the agent. I don’t see how Alonzo’s comments about intrinsic value would change that–but maybe I’m missing something.

    So, if the definition of “moral good” has not changed, then we are left with “moral good” whose definition does not actually imply any kind of “moral ought”. Which is pretty useless, really.

    Exactly. Alonzo’s invocation of the “masked-man fallacy” doesn’t solve the is-ought problem.

  35. lackofcheese says:

    cl,

    Say I’m a broke business owner in a high-theft area. I don’t have the money for a surveillance system, so I install fake cameras and signs that say, “You are being recorded,” even though nobody is being recorded.
    Is this a morally acceptable lie?

    Sounds okay to me.

  36. cl says:

    lackofcheese,

    So then, what becomes of:

    I’m not a desirist, but you could still reject the noble lie directly on desirist reasoning by positing that people have a desire to know the truth, which would be thwarted by such a lie.

    IOW, what, exactly, is the principle that makes this work? When is one allowed to abrogate it?
