Politics Discussion

Strikingwolf

Jul 29, 2019
Time to be called an idealistic moron!
Suppose, if you will, a society much like our own, in a future that may be possible but is somewhat far off and unlikely. We have workable cures for nearly everything, and for whatever remains (e.g. cancer) we have workable treatments with reasonable rates of survival. Now, suppose this (nearly identical) problem, which I'm leaving ambiguous because I don't want to look up a ton of illnesses:
You have item A, which can be used to save the lives of one of these people:
  1. An otherwise healthy person
  2. A person with illness B, which is debilitating but curable
  3. A person with illness C, which is one of the few illnesses we do not have an absolute cure for, but it acts slowly and person 3 is currently still doing well
Who would you save? (woo, new trolley problem, everyone!)
 

Strikingwolf

Jul 29, 2019
I would rather have a fair possibility for both instead of being based on "inferior" and "superior".
The thing is, it isn't about "inferior" and "superior". The only things we would rely on are purely measurable criteria, making it actual superiority.
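To make "purely measurable" concrete, here is a toy sketch of the kind of ranking I mean (every field, weight, and number here is made up for illustration; real triage is vastly more involved):

```python
# Toy sketch: rank candidates for a scarce treatment by measurable
# quantities only. All fields and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    expected_life_years: float  # prognosis if treated
    survival_odds: float        # chance the treatment works (0..1)

def expected_benefit(c: Candidate) -> float:
    # Expected life-years gained: a measurable quantity, with no
    # judgment about anyone's worth as a person.
    return c.expected_life_years * c.survival_odds

candidates = [
    Candidate("otherwise healthy person", 50.0, 0.95),
    Candidate("person with illness B", 45.0, 0.90),
    Candidate("person with illness C", 20.0, 0.85),
]

# Item A goes to the candidate with the highest expected benefit.
print(max(candidates, key=expected_benefit).name)
```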
 

Strikingwolf

Jul 29, 2019
The horrible, horrible things you learn about your friends when morality and ethics problems are the topic.
Yeah, it is such a brutal topic. And there isn't really an answer that makes absolute sense to us humans, because we are too closely attached. Personally, I would let an AI* handle it...
*in a box
I think most of us remember what can happen when people start considering some humans inferior to others. Genocides happen.
That is what happens when it isn't based on measurable facts and actual superiority. There is a huge difference, and yes, this is how genocides can start. Thus my mention of letting an AI handle it.
 

trajing

Jul 29, 2019
Yeah, it is such a brutal topic. And there isn't really an answer that makes absolute sense to us humans, because we are too closely attached. Personally, I would let an AI* handle it...
*in a box

That is what happens when it isn't based on measurable facts and actual superiority. There is a huge difference, and yes, this is how genocides can start. Thus my mention of letting an AI handle it.
Oh, on to more hypothetical situations, are we? Absolute logic != no morality. There is no fundamental tradeoff.
 

Strikingwolf

Jul 29, 2019
Oh, on to more hypothetical situations, are we? Absolute logic != no morality. There is no fundamental tradeoff.
Well, true. Although absolute logic is probably always going to be the best approach. On a related note, at our cores humans are absolute logic too. So back to the drawing board...
 

Strikingwolf

Jul 29, 2019
Maybe we should execute those babies at birth instead of wasting public money on helping them through life? That's what the numbers say.
We are talking about the point at which they are going to die, not their birth. Wasting public money is fine; money is worthless in the grand scheme of things.
An AI would be one step closer to genocide.
How? It wouldn't want to execute us all, just make decisions about who should get the "kidney".

Also, just as a response to Twitter: I am purely playing devil's advocate here. I couldn't make any of these decisions IRL, and I don't expect anyone to. I also think the very concept is disgusting. But if we look at it without our human emotions, then it becomes clear what should happen.

Oh, and you do realize that this already happens all the time.
 

trajing

Jul 29, 2019
Also, just as a response to Twitter: I am purely playing devil's advocate here. I couldn't make any of these decisions IRL, and I don't expect anyone to. I also think the very concept is disgusting. But if we look at it without our human emotions, then it becomes clear what should happen.
Wait, there's a discussion going on about this on Twitter?
 

Strikingwolf

Jul 29, 2019
You are reducing human lives to numbers. Attitudes like that are how bad things start.

And yes, I know it happens, but that does in no way mean I support it.

Now, we could also value people by their bank account. Maybe I should be prioritised at the cost of Bob, since I'm wealthy? How would Bob feel about that?

And don't say that feelings aren't important. Feelings separate us from robots. If we don't have feelings, what stops us from mass murdering "inferior" people?
Humans are numbers. Everything is numbers. Also, I don't really think feelings are important, and I think robots would be a better species than us... oh, and we are robots; our brains are big computers.
 

trajing

Jul 29, 2019
You are reducing human lives to numbers. Attitudes like that are how bad things start.

And yes, I know it happens, but that does in no way mean I support it.

Now, we could also value people by their bank account. Maybe I should be prioritised at the cost of Bob, since I'm wealthy? How would Bob feel about that?

And don't say that feelings aren't important. Feelings separate us from robots. If we don't have feelings, what stops us from mass murdering "inferior" people?
In theory, an AI can have feelings.
 

psp

Jul 29, 2019
I know it happens, but that does in no way mean I support it.

You are reducing human lives to numbers. Attitudes like that are how bad things start.
Now, we could also value people by their bank account. Maybe I should be prioritised at the cost of Bob, since I'm wealthy? How would Bob feel about that?
And don't say that feelings aren't important. Feelings separate us from robots. If we don't have feelings, what stops us from mass murdering "inferior" people?
You are trying to make this argument into something it is not. It is not about how rich or poor you are, nor about superior or inferior.
Think of it in the simplest possible terms, then we can go from there.
You've got one kidney: save a guy who is healthy, or save a guy who will most likely die before 40 (whatever the number is).
Who would you save?
 

trajing

Jul 29, 2019
As for the (probably now buried) "You're just using a proxy to make you feel better" argument, why does that matter? In the end, all that matters is how you feel. I know that I would much rather just flip a coin than have to choose between choice A and choice B, both of which involve someone dying. That is my answer to the trolley problem: I flip a coin. Heads I pull the lever, tails I don't.
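Spelled out as code, that policy is literally just this (a toy sketch, nothing more; the function name is made up):

```python
import random

def trolley_decision() -> str:
    # Fair coin: heads -> pull the lever, tails -> leave it.
    # Randomizing means I never have to weigh one life against another.
    return "pull the lever" if random.random() < 0.5 else "don't pull"

print(trolley_decision())
```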
 

Strikingwolf

Jul 29, 2019
If I'm more "valuable" than you, what stops me from killing you? Human worth is a horrible concept.
The fact that I can still contribute to society. We are talking about deciding between two people where one is going to live longer, nothing more, nothing less. Oh, and you do see that we are all AIs :D
 

Strikingwolf

Jul 29, 2019
As for the (probably now buried) "You're just using a proxy to make you feel better" argument, why does that matter? In the end, all that matters is how you feel. I know that I would much rather just flip a coin than have to choose between choice A and choice B, both of which involve someone dying. That is my answer to the trolley problem: I flip a coin. Heads I pull the lever, tails I don't.
That is probably what I would do. I couldn't hold it over my own head, whatever the logical decision is.
 

Strikingwolf

Jul 29, 2019
This is somewhat true. Your decision-making capabilities are laced with emotion to discourage making the "wrong" choice; however, this could be considered subconscious logic.
Which it is :p

Yes, the universe works on logic. We work on logic. Everything is logic...
 

psp

Jul 29, 2019
That's the same thing. You're basically making one person worth more than another.
My best friend is worth more than someone random off the street. To me, yes. To that random guy's best friend, no.
It really is a matter of ideology. If I went back in time and something happened, and I had to choose to save Einstein or Picasso, I would choose Einstein, as I value science over art. It all comes down to semantics.
We are humans. We are always putting value on everyone and everything around us. It's how we work. We are programmed to consider ourselves better than our peers. Putting a value on human worth is bad, as you said; it can and has led to horrible things. Then again, we make choices every day. For better or worse, perhaps we should become more logical? But then, we wouldn't be humans.

Choose randomly or choose logically. It's your choice in this hypothetical situation. Regardless, if you were faced with this situation, would you choose randomly, or by cold, hard logic and numbers? Either way is wrong by your ideals. Again, it's all a matter of our life experiences, those little bits of proteins and amino acids that form memories in our brains...