Utilitarianism
Hey there banterers. Philosophy is part of the title of this forum, so I thought it might be interesting to talk some moral philosophy. Are there any utilitarians among you?
Out of the moral philosophies, it is the one that has resonated the most with me. If you don't know what it is, it is basically a moral theory that says actions that result in increased happiness and/or reduced pain/suffering are morally preferable. There are various formulations of this principle, but generally utilitarianism is more concerned with good consequences (in terms of happiness/pain) than it is with specific dos and don'ts. A couple of caveats for myself:
Many criticize utilitarianism, but I believe we often follow it when push comes to shove. For example, we might not like the idea of placing different worth on the lives of people, yet we might still prefer the death of a sick old person to that of someone young and healthy. So whatcha think?
Every conceivable context is hard to account for. However, I do think utilitarianism is better equipped than many moral theories, even though it's often a little unclear exactly how to achieve the desired results. It tells you what to aim for, but not how to achieve it.
It's pretty uncontroversial. That's what makes it such a great tool for propaganda, since heinous acts can be misframed as the utilitarian alternative to a hypothetical worse situation that's allegedly being prevented (nuking Japan's an obvious example of this in my mind). It's still a strong baseline for the critical mind that few people truly reject, but in practice, it's applied retroactively to justify actions as opposed to predicating them.
Your example generally holds, but I think that when push comes to shove, people view things in a more shortsighted way that runs contrary to utilitarianism. For example, if the older person were a relative, they might prefer the short-term emotional utility that person provides to them over the longer-term and more widespread social/labour utility that the younger person would provide.
Yeah, I'm more of a negative utilitarian (minimising unhappiness seems better to me than maximising happiness), but I'd call myself a utilitarian.
Also, in its pure form it's virtually impossible to apply, which leaves a lot of freedom, which I think is part of the reason it's abused so much. The only possible way of applying it is to generalise somehow, since you can't look into the future and judge outcomes case by case. I think the best way is not to base rules on it (with laws like 'stealing is bad' there are too many exceptions) but to evaluate abstract concepts, like empathy, jealousy, etc., and go from there, which requires a lot of honest introspection. But I guess you need that for any kind of moral compass.
I don't think people are going to act in a utilitarian manner en masse in their day-to-day lives. At least not consciously and consistently.
To me, utilitarianism's big potential is as a value and goal for society to pursue. A utilitarian principle, like the one posed in my first post, should be used as a guide to figure out the best politics/legislation. F.ex. whenever a new law or national road project or whatever is up for discussion, whether or not it is going to, in some manner, add to the (long-term) quality of life of people should be the ultimate thing to consider.
Yeah, that sounds like just a pretentious way of saying what I thought you said. Instead of flipping the order of things around, you're thinking in circles, because how do you decide what is ethical?
I don't agree with you though. Like, freedom doesn't have inherent value regardless of the outcome. Necessary limits are always put on freedom because of the bad outcomes that unmitigated freedom can cause. As for truth... see the example of the Jews hiding in your attic and a Nazi asking you about them. Everything is a means to an end, imo. But none of these moral systems work perfectly, cause they are all ad hoc rationalizations for an inner sense of morality that is ultimately more instinctual and less strictly rational... imo.
Also, the whole idea of an ethical system like this is that you can (in theory) determine which outcomes are desirable without ethics, in this case by somehow measuring people's happiness. No ethics come into it there.
It seems like the only thing you could say is that it's cause it's more fair or something, which in itself is a sort of good outcome. Like any of those values, if you drill down to the core... freedom, truth, democracy, etc. are only valued because they're implicitly associated with the utility they provide.
Omg, the whole point is that you develop ethics from utilitarian principles so that you don't have to go through that calculation. That ethics system was developed ages ago and we inherit it with our culture, so of course it feels like that. That doesn't mean there isn't anything like utilitarianism at the root of it.
Is it not something that every human being feels generally, regardless of how/where she/he develops?
Then, so you don't think I'm a complete idiot, I should add that there are of course morals derived from culture on top of that, which can potentially attempt to reprogram our base morals, like what might happen in very religious environments like a cult. Generally, everyday human interaction is something we've adapted to by natural selection. For good or bad, this (along with culture/experience) does equip us with a knee-jerk sense of morality, which is what most of us operate on in our daily lives. However, evolution has not necessarily equipped us with a way to figure out big-issue stuff like politics, so that's where I think moral theory is valuable.
So we know why humans have morals. We know what naturally selected morals are attempting to achieve, which is, roughly speaking, all those social interactions that have historically let us proliferate our genes into the future. Each person instinctually knows what these things are. Humans are social animals, so we want to have positive and meaningful relations with others. Humans are programmed to act in a way that leads to reproduction, so we want sex. We are programmed to avoid pain, so we want to be healthy and not suffer. And so on. Normal, healthy people want these things, consciously or not. We form societies and cooperate to better achieve them.

In utilitarianism, happiness to me is just the simplest way to represent these things that we naturally want. Getting them satisfies our natures and so makes us happy. However, it's not perfect, because we also want some things that we don't need. Like some of us want heroin. Hence, I like to sometimes add a time perspective (long term) because I think that tends to distill utilitarianism a little more into what really matters as described above. I personally could go for a more long-winded principle (my own version of utilitarianism), but I see the value of making it simple ("happiness").

Something which is nice with a consequence-based moral theory is that it can be empirically tested. Let's say you study life satisfaction compared to income and you find good evidence that satisfaction rises until a household earns 150,000 USD, but then the curve flattens out or even becomes negative. You could use that information to try to make a society where 10 households earn 150,000 each instead of one household making 1,500,000 and 9 making nothing. That's an example of how I think utilitarianism should be used.
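To put some rough numbers on that last example, here's a little back-of-the-envelope sketch. The log curve, the 150,000 saturation point and the satisfaction() function are just my own stand-ins for whatever real survey data would show, not anything from an actual study:

```python
import math

SATURATION = 150_000  # assumed income where extra money stops adding satisfaction

def satisfaction(income: float) -> float:
    """Toy life-satisfaction score: grows with log(1 + income), flat past SATURATION."""
    return math.log1p(min(max(income, 0.0), SATURATION))

equal_split = [150_000] * 10          # ten households at 150k each
concentrated = [1_500_000] + [0] * 9  # one household at 1.5M, nine with nothing

print(sum(satisfaction(i) for i in equal_split))   # ~119.2
print(sum(satisfaction(i) for i in concentrated))  # ~11.9
```

Under that assumed diminishing-returns curve, the same total income spread over ten households adds up to roughly ten times the summed satisfaction of the concentrated case, which is the intuition behind using such findings to guide policy.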
A slight side note: I also like some of the ideas of social contracts, as I find them quite descriptive. A society is a bunch of humans getting together. In order for everyone to do better, they agree to abandon certain freedoms. For example, everyone can on average do better if everyone agrees not to murder and not to steal from each other. The goal of society is to raise up those who adhere to the social contract. That is done through cooperation and also the removal of the freedoms that would otherwise put everyone down.
So if you say we have a visceral reaction to unfairness, that's no different from saying we have a visceral reaction to suffering... Both of these can be framed as "bad outcomes" and once again fed into a sorta question of utility. Like, the deeper question is why do we have visceral reactions to these things... and I think that essentially it's because they're implicitly associated with negative outcomes. Like you said before, even these concepts you say have inherent value don't have infinite value... I would argue the extent to which they don't have infinite value is the extent to which they can lead to bad outcomes when left completely unrestricted. Like, the reason that the inherent value of truth breaks down with regard to an example like the Jews in the attic... the only reason it breaks down is because the outcome it leads to there is so obviously negative. Or do you have some other explanation...?
Atoms don't really have shells with small ball electrons flying around them like planets. But it's a simple model that we "get". Similarly, breaking laws doesn't always lead to unwanted consequences, only usually, which is why we also tend to operate with judge and jury to assess the severity of crimes. We do care about consequences, but we make simpler rules, virtues and explanations because it is practical. It's easier to communicate. About using natural selection and genes to describe human behaviour: we wouldn't exist without our genes. Their blueprints make our bodies, including our brains, so the idea that you can separate behaviour from genes is... kinda religious, I guess? For all the evidence we have, genes do matter.
Edit: I would also say the example is in the realm of knee-jerk morals and so isn't necessarily something that moral theory needs to deal with.
Yeah, ok, you can say that, but that seems like a different way of phrasing the same thing.
The value of truth is overridden by the otherwise negative consequences it would cause. Why would you wanna know your wife is cheating if it doesn't lead to better outcomes? I get that the instinctual response is to say people would prefer to know such a thing, but I think there's an implicit logic as to why, which generally once again boils down to striving for better outcomes... like people would say stuff like honesty in a relationship is paramount to a successful relationship, etc. It's not so easy to actually divorce any of these values from either certain outcomes or at least the perception of pursuing certain outcomes.
So far I'm hearing from you that we just value some things cause we value them... which seems pretty unsatisfying. You don't think there's any underlying logic at play?
Upon reflection, the difficult question of utilitarianism (for me) isn't happiness, but why it should be maximised for all or wherever it potentially exists instead of for just a few.
I think you could couple it with evolutionary biology or other philosophies to figure that bit out, f.ex. the aforementioned social contract.
I don't have time to read the rest of the discussion; I may get back to it later.
I also think that it's easily achievable.
They established that I'm a utilitarian, then said I have to play with unrealistic hypotheticals because that's what utilitarians do. I don't like dumb hypotheticals.
One example was whether I should kill somebody who doesn't want to die if I know that death would cause less suffering for them than letting them live. Or something like that. I said imposing on somebody's choice would cause unnecessary suffering, and also that the psychological suffering it would cause me, personally, wouldn't justify it. It was a stupid hypothetical that has no practical use in the real world.
I.e. there's probably a reason most people would value freedom rather than slavery, truth rather than deception, etc. Like, you say democracy is self-justified. What is it about democracy that makes it self-justified, as opposed to tyranny being self-justified instead?
I.e. framing it as an attempt to maximize well-being and reduce suffering.
And there we have what I suspected: the ethics you proclaim to follow are basically aesthetics. I guess that's your intention, though.
We do appeal to a higher morality, but that higher morality is based on principles which are fundamentally utilitarian. Cruel totalitarianism + lack of freedom = less happiness -> freedom and democracy = good -> Chinese system = bad.
Edit: Anything that requires too much assumption is a **** moral hypothetical in the first place. It's exactly what Frown was describing when saying the ideology is easily manipulatable for people that are short sighted. Lets assume people are just as happy even if I take their freedoms. There for it's okay to take their freedoms. Let's assume somebody will suffer less if I kill them. There for it's okay to kill them. |
And yes, the freedom = good idea is based on the idea that it generally leads to more happiness (or rather less unhappiness, which is why I favour that approach). Of course there are exceptions and you can't judge for sure most of the time, which is exactly why such a general idea is necessary. You're not doing a calculation at any point, because that idea is preposterous: instead you create an ethical system with values like you describe, but as long as these are ultimately based on the idea that humankind is happier/suffers less that way, it's essentially utilitarian.
I should add one qualifier: the concept of justice may be the only exception where people believe in it because of an instinctive conviction, but I'm not sure it even applies there
If I believed reducing freedom would lead to happiness on a really fundamental level, then yes. Not necessarily always, though.
There's literally no point in questioning the ideology with a hypothetical that has no real-world example you could ever give. You're literally just saying, "Well, what if the floor was lava!" Well, yeah, of course I'd try to avoid the floor if it were lava, but it's not lava, so who gives a ****?
Wow so thoughtful. Much deep. How can anybody muster the brain power to answer such a profound question?