The Junk Drawer

cannabineer

Ursus marijanus
We have a heat pump, but we rent, so I have no idea what it uses as a refrigerant. It's in an awkward spot to get to, so I've never looked.
I do know anhydrous ammonia will suffocate you if you get caught in a cloud of it. They built an anhydrous plant close to the town I grew up in in Minnesota, and twice in 20 years they've woken people up in the middle of the night because of possible leaks with the wind blowing toward town.
Not a horrible average, but if they miss one event, that's all it will take.
When I worked in the Bay Area, we evacuated a few times because a local creamery blew its ammonia loop. They shut it down after a short while.
 

DIY-HP-LED

Well-Known Member
Seventy years ago, the United States received news of Stalin's death through a coded message intercepted by a 21-year-old Air Force Staff Sergeant. That individual happened to be the renowned singer-songwriter Johnny Cash.

Back in 1950, at the age of 18, Johnny Cash joined the U.S. military and was later stationed in Landsberg, West Germany, for a three-year tour. Landsberg served as an important outpost during the outbreak of the Cold War that followed World War II, confronting Soviet aggression. Because of Cash's exceptional proficiency at deciphering Morse code, he was assigned a significant role at his post in Landsberg, where he monitored Soviet communications.

On March 5th, 1953, while on duty, Staff Sgt. Cash transcribed a crucial communication from the Russians. It held great importance because it revealed the deteriorating health of Soviet leader Joseph Stalin. Given Stalin's position as head of the Soviet empire, his well-being was of utmost concern to the United States intelligence community and other Western powers. By relaying this information to his superiors, Cash became the first American to learn of the Soviet supreme leader's death, marking a significant moment in history.


 

Roger A. Shrubber

Well-Known Member

Artificial Intransigence

I don't understand why they have to give the AI motivation. It's not a real person; just tell it what the fuck to do and don't give it any options.
If they can't do that, then the fucking shit is useless and needs a LOT more work.
 

cannabineer

Ursus marijanus
I don't understand why they have to give the AI motivation. It's not a real person; just tell it what the fuck to do and don't give it any options.
If they can't do that, then the fucking shit is useless and needs a LOT more work.
To me the takeaway is that AI is fundamentally amoral. The given task is prime, and the “laws of robotics” are a nice conceit.

(add) AI, so long as we don’t do a Skynet, is perfectly positioned to provide a control against the strong human tendency to treat morality as axiom. It just ain’t.
 

Roger A. Shrubber

Well-Known Member
the strong human tendency to treat morality as axiom. It just ain’t.
So we need a machine to make it true? If the only way to make morality axiomatic is to have it enforced by an artificial intelligence, then it's still not true...
Same problem I have with religion. Why does doing what you should do demand a reward? Why don't you do what you should do because you'd expect the same treatment from other people? If the only reason you behave is to get a cookie, then what would you have done without the promise of a cookie?
If we don't have AI pointing us towards morality, would we ever be moral just because we all should be?
 

GenericEnigma

Well-Known Member
To me the takeaway is that AI is fundamentally amoral. The given task is prime, and the “laws of robotics” are a nice conceit.

(add) AI, so long as we don’t do a Skynet, is perfectly positioned to provide a control against the strong human tendency to treat morality as axiom. It just ain’t.
Agreed. I've been trying to wrap my mind around this.

It's striking how unhesitatingly decisive and committed AI actions are. There is no appeal to morality, just what is possible and what is not. Like trying to fully understand the expanse of time behind evolution, it does not fit into our programming. Generally speaking, such behavior is antithetical to humans, and as such comes across as unpredictable.

However, it's coldly logical. And frightening.

Ever see the movie "Red Planet" with Val Kilmer? Where the robot assistant goes rogue? I want to watch that again to see if I can determine how much the writers injected morality into that robot's behavior. I might not succeed in my determination.
 

cannabineer

Ursus marijanus
So we need a machine to make it true? If the only way to make morality axiomatic is to have it enforced by an artificial intelligence, then it's still not true...
Same problem I have with religion. Why does doing what you should do demand a reward? Why don't you do what you should do because you'd expect the same treatment from other people? If the only reason you behave is to get a cookie, then what would you have done without the promise of a cookie?
If we don't have AI pointing us towards morality, would we ever be moral just because we all should be?
That’s not what I meant by “providing a control”. I was thinking in terms of test vs. control in an experiment. AI isn’t marked by our history of a coupla billion years in a food chain.
 

Roger A. Shrubber

Well-Known Member
That’s not what I meant by “providing a control”. I was thinking in terms of test vs. control in an experiment. AI isn’t marked by our history of a coupla billion years in a food chain.
Morality is a subjective social construct, and it hasn't remained static throughout the centuries.
The motivations of being part of the food chain are, in large part, what shaped our present moral standards.
How is comparison to an artificial intelligence that has never known fear, hunger, anger, depression, love, hope, hate... going to be enlightening?
I'm not trying to be obtuse, it just seems like comparing apples and lock washers.
 

cannabineer

Ursus marijanus
Morality is a subjective social construct, and it hasn't remained static throughout the centuries.
The motivations of being part of the food chain are, in large part, what shaped our present moral standards.
How is comparison to an artificial intelligence that has never known fear, hunger, anger, depression, love, hope, hate... going to be enlightening?
I'm not trying to be obtuse, it just seems like comparing apples and lock washers.
I’m blue-skying. I’m trying to get a grip on the quasi-cognitive premises of the AI in question.

It is a sort of mind (or at this stage a coarse simulation of mind) that came to be in a new manner, so the “rules” from evolution that shape our minds don’t apply.
 

DIY-HP-LED

Well-Known Member
Morality is a subjective social construct, and it hasn't remained static throughout the centuries.
The motivations of being part of the food chain are, in large part, what shaped our present moral standards.
How is comparison to an artificial intelligence that has never known fear, hunger, anger, depression, love, hope, hate... going to be enlightening?
I'm not trying to be obtuse, it just seems like comparing apples and lock washers.
Morality and ethics have a biological basis and are essential for human survival. We survive as communities, not so much as individuals, because individuals wouldn't have lasted long for the vast majority of our evolution. Our problem is that our technology and the resulting large-scale social structures are changing faster than our small-scale tribal social evolution can adapt to.

People who were recently tribal, or who still live rural tribal lives, have different instinctive values and priorities than those who live urban lives. Some of us, including northern Europeans, came out of tribal existence just a couple of thousand years ago. Whether we live rural tribal or urban civil lives, we form hierarchical, caring, sharing communities of mutual support and defense.
 