I never suggested that I did. I will say, the whole concept of the singularity smacks mightily of Olaf Stapledon's classic Star Maker. Arthur C. Clarke considered that book to be perhaps the greatest single influence on his own writing, and its central theme is clearly delineated in 2001: A Space Odyssey and others of his works.
I am severely diabetic, in stage four renal disease, and going blind from diabetic retinopathy. With all of that, I do not drink alcohol except very rarely, and then never more than one mixed drink in an evening, maybe twice a year. If you want a challenge, try doing that with SLH.
quote:Originally posted by Coincidence That was a major point of the article. When humans reach the pretty low ceiling of making an AI with human-level IQ, that AI will then tear off the roof and potentially fuck up humanity in unfathomable ways.
Monkeys have no way to understand what we're doing, and so it could be for us and AI.
This is related to the idea that humans are just a stepping stone for intelligence.
And, just to add to what you are saying, we could conceive that LIFE is the same for all living things, and that "life" as a separate construct of the universe does not care what vessel it fills (paraphrasing the Bible for relative purposes), but is only concerned with filling the largest possible number of varied lifeforms so as to assure its continuation in a universe that is mostly harsh to it. In short, it doesn't care if it's humans or cockroaches, as long as it proliferates for the longest possible duration, thus increasing its chance to spread throughout the harsh and unfriendly galaxy.
If you accept this, then AI is a continuation of human hubris and not much better than Terminator 3, which I think we can all agree was the shittiest Terminator movie, not unlike Star Wars I, Jaws IV, or Toy Story II. (Also, for some reason, making three movies out of The Hobbit.)
Back to reality: I am trying to say that there is a point after which the chicken will not emerge from the egg, no matter who laid it. But I realize that you may be trying to say that no matter what emerges from the egg, Jesus will absolve him of his sins and he gets to go to Heaven.
Where you say life, I would say intelligence. But it could be the same thing: intelligent beings are better at travelling space, and thus increase the odds for life.
I think that if the universe has a point, it would rather have super-smart life than light-speed cockroaches, the better to observe itself, among other things.
You lost me in that bit about reality. Don't worry, it happens to me all the time.
And Terminator: Salvation was way worse than 3. If you can't see that, all your theories about life and intelligence are askew. ASKEW, I say.
Ex Machina was alright, barring the Asimov's rules bit. The first part of Interstellar wowed me. It's a much better vision of AI than I had ever considered. The humanoid robot is a bunch of bullshit. The Matrix robots were sorry too. Making existing things conscious is so much more thoughtful and right-thinking, as far as I'm concerned.
What happens when your farm equipment and farming altogether goes fully autonomous? That's a recipe for end times, no? Complete control of the food supply. Farms will be autonomous long after heavy trucks and trains so the supply chain logistics are sorted.
What can we offer a computerized consciousness? I can think of two things, companionship and physical generalist dexterity.
Virtual reality headgear is going to go far.
I can see, down the road, students putting on headgear and suddenly being in class.
I can see it so that people wouldn't have to leave their homes.
I actually wouldn't mind at all putting on head gear rather than drive 25 miles to go to a mandatory class where I won't learn anything.
My paperwork is already mostly online.
When it's all online I wouldn't even need to go to my agency.
I can see traffic becoming almost nonexistent.
And people will buy stock in the internet service providers.
Because, as much as it already is, it will become much more mandatory to have internet in your home. Without it you would truly be lost.
quote:Originally posted by Coincidence Good question. But it depends on how that consciousness works.
There might not be a duality. Human and AI minds can merge, as cybernetics merge organism and robot.
Johnny Depp did a movie called Transcendence. It was about the singularity, anyway. It was thoughtful in a way, but not great. Johnny's character goes full digital, and the merger of human intellect and creativity with endless computing power leads to giant leaps of technology, culminating in his transition back to physical form, but with superpowers. I think it's a decent look at what a teenager might want from the moment. The other side is something like Max Headroom, where it's just a bunch of retards on the internet competing for attention and being as disruptive as possible.
quote: “Utilitarianism tells us that we should always do what will produce the greatest happiness for the greatest number of people,” he explained. In other words, if it comes down to a choice between sending you into a concrete wall or swerving into the path of an oncoming bus, your car should be programmed to do the former.
Deontology, on the other hand, argues that “some values are simply categorically always true,” Barghi continued. “For example, murder is always wrong, and we should never do it.” Going back to the trolley problem, “even if shifting the trolley will save five lives, we shouldn’t do it because we would be actively killing one,” Barghi said. And, despite the odds, a self-driving car shouldn’t be programmed to choose to sacrifice its driver to keep others out of harm’s way.
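The two ethical frameworks in the quote can be contrasted as decision rules. Here's a toy sketch in Python; every name and number (the option labels, the casualty estimates, the `sacrifices_driver` flag) is a hypothetical illustration, not a real self-driving-car API:

```python
def utilitarian_choice(options):
    # Utilitarian rule: pick whichever option minimizes total expected harm.
    return min(options, key=lambda o: o["expected_deaths"])

def deontological_choice(options):
    # Deontological rule: never select an option that requires actively
    # sacrificing the driver; among what's left, still prefer fewer deaths.
    allowed = [o for o in options if not o["sacrifices_driver"]]
    return min(allowed or options, key=lambda o: o["expected_deaths"])

# Hypothetical scenario from the quote: hit a wall (driver dies)
# or swerve into the path of an oncoming bus (many die).
options = [
    {"name": "hit the wall", "expected_deaths": 1, "sacrifices_driver": True},
    {"name": "swerve toward the bus", "expected_deaths": 5, "sacrifices_driver": False},
]

print(utilitarian_choice(options)["name"])    # -> hit the wall
print(deontological_choice(options)["name"])  # -> swerve toward the bus
```

Same scenario, opposite verdicts: the utilitarian car sacrifices its driver to save five, while the deontological car refuses to actively sacrifice anyone, exactly the disagreement Barghi describes.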
Hacking your AI to not kill you is going to be huge.