
Are Robots Considered Humans?


I have always pondered this question, and after seeing the movie I, Robot I started to wonder whether we are playing at being another kind of god. But to tell you the truth, I'm still not sure.

Being a person means you have rights, and the government is obligated to defend those rights. But with the rise of artificial intelligence, we are facing new questions about what it takes to constitute a person. In the movie I, Robot, was Sonny considered a person? When he committed the murder, there was a lot of questioning about whether he, as a machine, should be charged or punished. Would it make sense to punish non-living material like Sonny? I don't really think so.

Sometime in the future, programmers may be able to develop a robot that behaves just like a person, like Sonny. Sonny behaved just like a person, but does that make him a person? If so, should he have been given the moral right of self-determination rather than being used merely to serve mankind? What if it were possible for us to create a human-like machine or robot with the same capacities and abilities as we have? A machine that could think on its own and make choices without any human interference. A robot that could learn things when taught, rather than having the software downloaded or installed. As far as we know, all a computer can do is represent the knowledge supplied by the programmer; it can only do what a programmer tells it to do. In other words, it cannot learn, and therefore it isn't intelligent. Perhaps in the future, as technology evolves, machines will develop their own independent intelligence.

In my opinion, the defining characteristics of personhood are being human and being alive. To be living means to be complex, organized, and made of organic molecules. As living humans we acquire and process materials and energy, we maintain homeostasis (a stable internal state), we grow and respond to environmental stimuli, and we reproduce using deoxyribonucleic acid (DNA). Robots, at least as far as I know, do meet a lot of these requirements: they are extremely complex and organized, they process energy, they maintain a stable state, and they respond to their environment. But they are not made of organic molecules, they don't grow, and they don't reproduce using DNA. An organism has to meet all of the above requirements to be alive. Sonny could not meet them, therefore he is not alive, and therefore he fails the dictionary definition of a person as a living being.

If we made robots that were lifelike and able to die like people, they wouldn't be as useful. One of the reasons for making robots is to make helpers who are more durable than humans. But does a robot need to be alive to be a person? There are other definitions to consider before making an assumption. We don't give people rights simply because they are alive; if that were so, we would give trees and flowers rights. What is the fundamental difference between a flower and a person? A person can think and feel. He is aware of his existence and his experience. He is intelligent. In our experience so far, things which are intelligent are always alive. Could this rule be broken? Could a non-living thing, however well programmed, ever have the qualities of self-awareness, intelligence and consciousness?

Trying to classify robots by whether or not they could be considered people only left me confused and with more questions than I started with. I do not believe that a machine like Sonny deserves the rights we associate with personhood. Just because it is a physical thing that behaves like us does not mean it should be treated the way people are treated. The answer still remains unclear to me, because I cannot be completely sure that my opinion is correct. After all, it is only an opinion. What do you think?


Say, e-mail me if someone does come up with a self-conscious, thinking, feeling and learning robot. I don't think I'll be getting any e-mails. But theoretically the robot would still be a non-living object unless someone designs one which develops from an egg and sperm and grows into an old, frail thing that you have to put in a nursing home. Robots are not living creatures, and therefore they aren't human, because the Oxford dictionary describes a human as a creature which comes from nature.


I know this is confusing, but robots are not humans, and humans are created by God, so if you believe that in any age robots can compete with humans, that is not true. You are confused because you have studied robots but not what God has created. Always remember one thing: the CREATOR is far more powerful than the product, and you can apply this everywhere. Just as God created humans and humans have never competed with God in any age, so it will be with us and robots.

Secondly, robots exist only to serve their creator, which is humanity. They are not human, so they have no place in personhood; they need no sympathy and no place in our society. AI is not yet powerful enough to create human-like robots, and it never will be. Humans are created by God, so we can't make a robot which is equivalent...

Thirdly, if you have studied AI, you know that all robots are domain-specific, which means that if a robot working in a factory comes out onto the road, it cannot change its actions; we need some interface with the robot. If you ever try to make a domain-independent robot, you will face real problems; I think it is impossible in this age, even with this technology. And if we were ever able to make such a thing, a domain-independent robot, it would respond very slowly to its environment because of the huge amount of data fed into it; it could not react to events as quickly as a human can. I could list countless things that hinder making a human-like robot, so brother, don't worry, we are very far from making such robots.

And the last thing: you asked whether they should be punished if they kill a person. I think technology is a tool, and its use depends on its user. Take nuclear energy as an example: in the hands of some terror-minded country it is used to frighten other nations, while good countries use it to make electricity and serve humanity. So technology is not bad in itself, only its use, if it is used in a bad way... cheer up, Sonny is not coming to the real world yet :P


the CREATOR is far more powerful than the product

You're saying a nuke is less powerful than the human who created it? That's BS!
We will kill everyone on this planet if we keep building these machines and robotic creations and letting them work and fight for us because we are too lazy to work and fight ourselves.


VG


I can't remember whom I'm quoting, but it was a famous artificial intelligence scientist...

Computers will never have free will. But there is no reason why a computer couldn't have the same delusion of free will that humans have.


Basically, he argued that the human brain is just a powerful reasoning tool (which could easily be recreated as a computer program) but with a huge, chaotic input.

Basically, we think people have free will because we cannot accurately predict what they are going to do in normal situations... but most scientists believe that this is not free will, it's just a chaotic system.
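To make "chaotic system" concrete, here is a minimal illustrative sketch (my own example, not anything from the scientist being quoted): the logistic map is completely deterministic, yet two starting values that differ by one part in a billion soon produce completely different trajectories, so in practice its behaviour is unpredictable even though nothing in it is "free".

```python
# A deterministic system whose behaviour is unpredictable in practice:
# the logistic map x -> r * x * (1 - x) in its chaotic regime (r = 3.9).
def logistic_trajectory(x0, r=3.9, steps=40):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.500000000)
b = logistic_trajectory(0.500000001)  # differs by one part in a billion

# The two runs start out indistinguishable and end up completely different.
for step in (0, 10, 20, 30, 40):
    print(step, round(a[step], 6), round(b[step], 6))
```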

Read up on chaos theory.
Also read up on "free will".

It all boils down to: do you believe in the soul? It's all very philosophical.

Basically, we can simulate a brain cell in a computer. It's very, very simple.
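As a rough sketch of what "simulate a brain cell" can mean (the model choice and parameter values here are my own illustrative assumptions, not anything from the post), this is a single leaky integrate-and-fire neuron, one of the simplest standard models:

```python
# A single leaky integrate-and-fire neuron: the membrane voltage leaks toward
# rest, accumulates incoming current, and emits a spike when it crosses a threshold.
def simulate_neuron(input_current, dt=1.0, tau=10.0, v_rest=0.0,
                    v_threshold=1.0, v_reset=0.0):
    v = v_rest
    spikes = []
    for i in input_current:
        # leak toward the resting potential, then integrate the input current
        v += dt * (-(v - v_rest) / tau + i)
        if v >= v_threshold:        # threshold crossed: fire and reset
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return spikes

# Constant weak input: the neuron charges up and fires periodically.
print(simulate_neuron([0.15] * 30))
```

The single unit really is this simple; as the next line says, the hard part is wiring up billions of them and their connections.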

What is difficult is figuring out how to connect billions of simulated brain cells.

If it were possible to freeze your brain and make an exact computer simulation of it: every single input, every single brain cell's state (firing / not firing), factoring in the effect of any chemicals on the neurotransmitters... wouldn't the simulated brain act exactly like you?

I don't believe in a soul, or anything magical.
The brain, to me, is a machine built out of chemicals, just like a computer chip; they are just made from different chemicals.

And the brain has been designed over billions of years of evolution, so it has the advantage over the silicon chip (for now) :P


Robots are objects created from non-biological matter. They have no life, no morals, no conscience; they are programmed to do what they are supposed to do and do not have the freedom to choose...


You're saying a nuke is less powerful than the human who created it? That's BS!

We will kill everyone on this planet if we keep building these machines and robotic creations and letting them work and fight for us because we are too lazy to work and fight ourselves.

VG



 

Well, "powerful" does not mean let's have a fight and decide who is stronger; there are other factors, and humans can win a fight against the machines. As I said, robots are domain-specific things, so fighter robots :P are hard to defeat, but humans can do it, because the creator also knows the weaknesses... so you didn't get my point.


Whether it qualifies as alive isn't really the issue in itself; determining whether it's alive is just how you decide whether to label it with rights. The question we are spiraling around is: if robots could feel the same as we do, shouldn't they be treated equally? If I wrote a basic program, and when you swear at it, it replies that you hurt its feelings, does that mean it actually felt it? If you argue that an exact replication of the human brain could be made, in simulation or in reality, and that it would feel the same as we do, then any line of code within it would bear the same feeling. If a simulated brain feels, then so do QBASIC programs. Now, do you agree with that, or is it only when it's an 'exact' brain that it deserves rights? What, then, counts as a perfect replication of ourselves? I must conclude that only things from nature can be given rights. I cannot argue free will, because that is a very fuzzy (pun intended) area.
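Purely to illustrate the point about the swearing program (this is a hypothetical toy in the spirit of the QBASIC example, not anything anyone has built), here is roughly what such a program looks like: it prints the right words, while nothing inside it remotely resembles feeling.

```python
# A toy "hurt feelings" program: it reacts to certain words, but there is
# plainly nothing inside it that feels anything. The word list is made up.
SWEAR_WORDS = {"stupid", "useless", "idiot"}

def respond(message: str) -> str:
    if any(word in message.lower() for word in SWEAR_WORDS):
        return "That hurt my feelings."
    return "Okay."

print(respond("you stupid machine"))  # -> "That hurt my feelings."
```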


Yeah, the thing is that we are making domain-specific robots, so we are making just machines, and those can't be given human-like rights. But if at some stage we make a robot which feels like a human, then I am sure it will be a great achievement, and those robots could be punished like humans. They would not be like today's robots that follow the instructions fed to them; they would decide their own actions and build their own knowledge base... but we can't do that, as I mentioned earlier.


In my opinion, robots can be considered human when they can think, act, move and do everything a normal human can do. Of course it's difficult to draw the line between robot and human: robots aren't natural, and we know it. If robots were to take over the Earth and eliminate the human race (or even better: eliminate every living creature on the entire planet), and aliens came and observed them, the aliens wouldn't know any better than to classify these robots as 'living creatures' (presuming they act somewhat like human beings). We'd do the same if we found such a strange civilization. It's very probable that we would never accept robots as our equals. God doesn't see us as his equal (and neither do we). He (presumably) created us, and the same goes for us and robots: we created them, so they are lower in rank than us.


In my opinion, robots can be considered human when they can think, act, move and do everything a normal human can do.

Does that mean that a severely crippled person (who cannot do everything a normal human can do, not even close) is not human?

 

It's very probable that we would never accept robots as our equals. God doesn't see us as his equal (and neither do we). He (presumably) created us, and the same goes for us and robots: we created them, so they are lower in rank than us.


What if the robots become better than us? It seems unlikely now, but realize that robotics has gone from nothing to the latest Honda prototype in less than 100 years, far, far less than the millions of years of natural evolution. Even if you refuse to accept evolution, it is important to realize that the rate of improvement is astounding. Given another 100 years, it would not be surprising if robots were approaching natural human ability (or even exceeding it).

 

You are confused because you have studied robots but not what God has created. Always remember one thing: the CREATOR is far more powerful than the product, and you can apply this everywhere. Just as God created humans and humans have never competed with God in any age, so it will be with us and robots.

Incorrect. A person can program a computer to do math many times better than that person could ever do it themselves. Power is not a single variable: the human has the ability to program, the computer has the ability to do computation. And programming is itself a more abstract form of computation. So don't be so sure about how the scales will always balance.

 

Thirdly, if you have studied AI, you know that all robots are domain-specific, which means that if a robot working in a factory comes out onto the road, it cannot change its actions; we need some interface with the robot. If you ever try to make a domain-independent robot, you will face real problems; I think it is impossible in this age, even with this technology.

This is somewhat false. Most AIs are indeed domain-specific. However, so are most parts of the human brain: the different regions handle their individual domains and report to one another. AIs can be made to do the same thing. You can develop a learning or automated program to handle each specific domain and then have a final AI whose domain is the information reported to it by the other subsystems, as in the sketch below.
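Here is a rough sketch of that arrangement, with entirely hypothetical module and method names (nothing here comes from the original posts): each narrow module handles only its own kind of input, and the coordinator reasons only over what the modules report.

```python
# Domain-specific subsystems reporting to a coordinator whose own "domain"
# is just the set of reports. All names and rules here are illustrative.

class VisionModule:
    def report(self, frame):
        # stand-in for a real perception system
        return {"obstacle_ahead": "wall" in frame}

class NavigationModule:
    def report(self, position):
        return {"at_goal": position == "goal"}

class Coordinator:
    """Decides an action using only what the subsystems report."""
    def __init__(self, modules):
        self.modules = modules

    def decide(self, inputs):
        reports = {}
        for name, module in self.modules.items():
            reports.update(module.report(inputs[name]))
        if reports.get("at_goal"):
            return "stop"
        if reports.get("obstacle_ahead"):
            return "turn"
        return "forward"

robot = Coordinator({"vision": VisionModule(), "navigation": NavigationModule()})
print(robot.decide({"vision": "wall ahead", "navigation": "corridor"}))  # -> "turn"
```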


Does that mean that a severely crippled person (who cannot do everything a normal human can do, not even close) is not human?

I referred to the perfect human, as it should be in our eyes. Of course the crippled person is human. Ask him or her about the long walks on the beach in his or her early years; then he or she was more like the human I meant.

 

What if the robots become better than us? It seems unlikely now, but realize that robotics has gone from nothing to the latest Honda prototype in less than 100 years.



Robots are already better than us. There are robots that can keep a ball bouncing off a racket indefinitely (provided nobody interferes with the process). I can't see any human being doing that. And what about those underwater submarine robots? We certainly can't dive that deep.

 

Still, saying robots are better than us is a bit exaggerated. They can do only one thing, and they can do that really, really well. But as things stand, we humans are on average still better than any robot available. Surely, though, the more intelligent you make a robot, the more dangerous it will be.

 

Imagine that your brain is a computer a thousand times better than the one you're sitting behind now, and we put a computer two thousand times better than yours in the head of a robot. By then we will have complete knowledge of how the brain works, and the robot is programmed accordingly. Is it then smarter than us? Or even better?

 

No. There is always a little something inside us that tells us 'We're better!' And maybe that is the sole thing that separates us from these super-robots: our ego. Of course you could program an ego into a super-robot, but it would always be a rigid, inflexible one. We have dynamic egos. Maybe that is why Homo sapiens has survived where other human species failed: our dynamic egos have enabled us to survive in a constantly changing environment.

 

And maybe that little something is based on nothing. But it's still there. And it's nagging.


Robots are not humans because they have no emotions. They can't feel anything.

 



It's difficult for me to say that they don't feel anything, because then why do we feel? That would lead to there being some 'mind' in us that is scientifically untestable, and I don't like to debate topics lacking scientific study (unless purposely asked about my religion). I still have to conclude, by a kind of reflexive property, that only a physical brain can feel as we with brains feel. And when the day comes that synthesized brains are called robots, then they would have to be considered human. They must be, or any doubters would be asked, "what is it missing?" and, being scientists, couldn't respond with "a soul."

For the subtext-impaired, what I just said is: by trying to remain a scientist, and excluding my religious beliefs, I must say that another brain created chemically in a lab does feel as we do. Another important question that follows is: if they are the same, and therefore likely to ask questions about right and wrong, what consequence is there post-mortem for crimes they commit here on Earth? Would they, after realizing this problem of their lives being pointless, all go and commit horrible crimes because they're damned no matter what?

Ahh... err... no. Just because a person doesn't believe in after-life consequences is no statistical cause to be violent, and being something that is even more certain it is soulless shouldn't make it worse. Hopefully they would see the chaos we live in and try to help us sort it out. Life being pointless isn't that scary; it just keeps you from overachieving in the wrong direction. In fact, that's why most robots would fit in perfectly in America: "when you do everything right, people will wonder if you have done anything at all." - 'God', Futurama


No. There is always a little something inside us that tells us 'We're better!' And maybe that is the sole thing that separates us from these super-robots: our ego. Of course you could program an ego into a super-robot, but it would always be a rigid, inflexible one. We have dynamic egos. Maybe that is why Homo sapiens has survived where other human species failed: our dynamic egos have enabled us to survive in a constantly changing environment.

 

And maybe that little something is based on nothing. But it's still there. And it's nagging.



Considering all the damage we do to one another because of our concept of self and feelings of superiority, I am not sure that is an advantage.

 

A programmed ego would also not necessarily be static; in fact, it almost certainly would not be. Programs can self-modify, both in response to input and randomly, or according to any given algorithm.
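As a minimal, purely illustrative sketch of that point (the update rule and all names here are made up, not a real design), here is an "ego" parameter that changes both in response to input and with a bit of random drift:

```python
import random

# A programmed "ego" that is not static: a single confidence value that shifts
# with feedback from outcomes and also drifts slightly at random.
class AdaptiveEgo:
    def __init__(self, confidence=0.5, learning_rate=0.1, noise=0.02):
        self.confidence = confidence
        self.learning_rate = learning_rate
        self.noise = noise

    def update(self, outcome_was_success: bool):
        target = 1.0 if outcome_was_success else 0.0
        # move toward the observed outcome, plus a small random perturbation
        self.confidence += self.learning_rate * (target - self.confidence)
        self.confidence += random.uniform(-self.noise, self.noise)
        self.confidence = min(1.0, max(0.0, self.confidence))

ego = AdaptiveEgo()
for outcome in [True, True, False, True, False, False]:
    ego.update(outcome)
print(round(ego.confidence, 3))
```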

 

Also, ego has nothing to do with adapting to an environment. You can build all kinds of adaptive things with no self-awareness at all.

