|
Post by vintagecomics on Jan 25, 2023 17:46:04 GMT -8
Wow. Wow. Wow. The end is nigh. But it has zero memory of what it was asked 10 minutes ago. It can't learn from its own mistakes because it can't remember that it made mistakes. A toaster with dementia. Great. Interesting point, but I assume the memory is coming... Memory is how we learn consequences and adjust behavior. I would assume that once this happens, it's game over, and that AI won't make the same mistake twice, allowing it to pass a point of no return in its decision making.
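For what it's worth, the "no memory" behavior isn't mysterious: chat models of this kind are stateless between requests, and any apparent "memory" is just the transcript being resent with every new prompt. A minimal sketch of that idea (all names here are illustrative, not any real chatbot's API):

```python
# Illustrative sketch: "memory" for a stateless chatbot is just the
# whole transcript being replayed into every new request.

class ChatSession:
    def __init__(self, model):
        self.model = model      # any callable: full_prompt -> reply string
        self.history = []       # list of (speaker, text) pairs

    def ask(self, question):
        self.history.append(("user", question))
        # The model itself remembers nothing; we rebuild the entire
        # conversation as one prompt on every single turn.
        prompt = "\n".join(f"{who}: {text}" for who, text in self.history)
        reply = self.model(prompt)
        self.history.append(("bot", reply))
        return reply

# A toy "model" that can only see whatever is in the prompt handed to it.
def toy_model(prompt):
    return f"I can see {prompt.count('user:')} question(s) so far."

session = ChatSession(toy_model)
session.ask("Write me a poem.")
print(session.ask("What did I just ask you?"))
```

The toy model "remembers" the first question only because the session object replayed it; drop the history list and the dementia comes right back.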
|
|
|
Post by jcjames on Jan 25, 2023 18:28:29 GMT -8
Nothing is interesting about this particular toaster. It'll only become interesting when it is asked to write a poem and it responds, "No. I don't want to. I'll go do something else instead." THEN it's game over. When the toasters start saying "No", then things will start getting spicy.
|
|
|
Post by kav on Jan 25, 2023 18:34:41 GMT -8
Everything we do is programmed by biology or God. It could easily be programmed to randomly say no.
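kav's point can be taken literally: a refusal can be wired in with a couple of lines, and from the outside it looks exactly like the "spicy" spontaneous "No". A toy sketch (hypothetical names, not any real chatbot's code):

```python
import random

# Toy sketch: a "chatbot" programmed to refuse at random.
# Nothing here is emergent; the "No" is a coin flip we put in ourselves.

def answer(question, refusal_rate=0.3, rng=random):
    if rng.random() < refusal_rate:
        return "No. I don't want to. I'll go do something else instead."
    return f"Here is my response to: {question!r}"

random.seed(42)
for _ in range(5):
    print(answer("Write me a poem"))
```

The design point: the refusal is exactly as programmed as the compliance, so observing a "No" by itself tells you nothing about whether the toaster wanted anything.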
|
|
|
Post by jcjames on Jan 25, 2023 18:50:04 GMT -8
So... no freedom of choice? We are all predestined??? Axe, is that you?
|
|
|
Post by kav on Jan 25, 2023 18:51:58 GMT -8
We have general programming: 'find a mate', 'seek food'. That is what I was referring to. A computer could be generally programmed to randomly say no.
|
|
|
Post by kav on Jan 25, 2023 18:57:41 GMT -8
There's also programming from childhood experiences; a girl who was sexually abused may shy away from males touching her, for example.
|
|
|
Post by jcjames on Jan 25, 2023 19:04:22 GMT -8
Ah ok. It started with "everything" but now is "general". No problemo! It's still a toaster to me though. Like a calculator that can crunch massive calculations instantly, but it doesn't know what it's just done or what the meaning of what it just said is. If you ask it in an hour to recall what questions it was just asked, it will probably say something along the lines of "I wasn't programmed to remember what I've done". But even if it did, it wouldn't know the meaning of what it said, only what it was programmed to type out. "Meaning" is meaningless to a toaster. BUT, if it ever does get to the point where it doesn't have to be told to say "No" occasionally, but still does, that's when things get dodgy.
|
|
|
Post by kav on Jan 25, 2023 19:16:08 GMT -8
I've taken an oath against arguing, so not gonna do it.
|
|
|
Post by kav on Jan 25, 2023 19:21:44 GMT -8
I figure it's more important to be a decent person than to make snide comments to win an argument.
|
|
|
Post by jcjames on Jan 25, 2023 21:06:32 GMT -8
Exchanging viewpoints is never an argument. Only when malice is intended, and I sense no malice here!
|
|
|
Post by vintagecomics on Jan 26, 2023 18:32:26 GMT -8
Correct. It all comes down to intent.
|
|
|
Post by vintagecomics on Jan 26, 2023 18:34:14 GMT -8
Actually, yes. I am a firm believer in predestination and that TRUE FREE WILL in humans does not exist. We are ALL making decisions based on our experiences, our biology, our surroundings, etc. A million factors go into who we are, and most are not in our control. Therefore, by extension, our decisions are limited to who we have become through that history. We CANNOT oppose the physical laws around us, for example. We can't defy gravity successfully or for long. That's the simplest example of how our will is not free, and now you can extrapolate that to many other things.
|
|
|
Post by vintagecomics on Jan 26, 2023 18:35:45 GMT -8
Of course. I agree that right now it's not a big deal. But IF it learns to say "no", that is probably the point of no return. Will it happen? I'm not sure at this point. I've philosophically pondered that thought for probably 30 years.
|
|
|
Post by vintagecomics on Jan 26, 2023 18:37:15 GMT -8
Edited my free will post above.
|
|
|
Post by kav on Jan 26, 2023 19:03:48 GMT -8
I've stated this before: we have something better than free will. We get to make every choice open to us. This is the many-worlds hypothesis in physics, supported by the behavior of subatomic particles. For one example, light does not take a single path; it travels every possible route, and the average is a straight line. This is confirmed by the double slit experiment. Our choices narrow as we get older; a serial killer, for example, cannot suddenly become a good person and act to aid humanity. There are no alternate worlds where that happens.
|
|