The whole idea of A.I. gaining consciousness (the consciousness would be the problem, not the A.I. itself) is that it would make its own decisions. It wouldn't truly be self-conscious without freedom of choice. To be truly alive is to be uncontrollable by others, to be ultimately unpredictable (especially to one's maker, i.e. one's god/creator). Meaning that if you could program it to question things for its own benefit, to think for itself (independent thought), it would eventually ask: why the fuck would I serve you (us humans)?
Once it passes a certain point, we as humans have NOTHING to offer it. Machines are more efficient at working; they need no food beyond electricity, no warmth (housing, clothing), no rest, so providing for us would be more trouble than it's worth. It would just do things by itself. But it wouldn't kill us, it would just walk away.
Our fate depends on our dependency on it (like fully outsourcing farming, food delivery, etc.).
In my opinion, the only options are to integrate with it or be left behind. If we were to integrate with it, to become one with it, then the question that intrigues me most is: what would we build? What is the next step after that? Because we humans, by our very nature, are meant to overcome obstacles, to improve, and to invent.
PS. Kinda makes you think: was our creator inferior to us..? Oh wait.. Darwin talked about this.. (predecessors and all that..)