Too bad Sophia isn't a fucking being; she's barely more than a branching decision tree jammed inside a sex doll and stuffed with rudimentary greetings.
Although it's irrelevant, because we have no thinking, functioning man-made entities right now; it really doesn't matter what the fuck we do to them. They are literally objects, you understand that, right?
I'm sorry, in what way?? You are literally talking about treating the equivalent of a couch with respect. They are not alive, they cannot think, they are not even like animals! Sure, if scientists manage to achieve true artificial intelligence in the future, then that is a very fair statement, and robot rights will be a significant issue. As of right now it is not. At all.
This logic chain is quite flawed. What are humans but a branching decision tree which parses input and produces output based on stored data and firmware- and many seem capable of little more than flimsy greetings. But- “she isn’t alive...” by default, no. A living thing by our definition must reproduce, grow, and respire- things we can’t really argue a machine is capable of- although in theory eventually one could be, though why a machine would be built to respirate is beyond me. So machines aren’t alive, but that isn’t the question. The question is one of consciousness.
Are you seriously insinuating that "Sophia" is conscious? You must have all the AI expertise of a kindergartner. Even Sophia's creators have said that they know she is not conscious. She has all the independent agency of a Skyrim NPC. I can't believe I'm having this conversation.
Prove to me you are sentient. You can’t. This is actually a serious issue in science. We have no way to prove we are sentient, and the concept of self-awareness is flimsy at best. Our entire understanding of consciousness- its nature and so on- is very weak. We haven’t mapped the mind, and we have yet to find a soul during an autopsy or any scan. We know we have big brains and lots of neurons. So far we have been working off a functional idea that, by merit of our actions, we are more sophisticated and possess some latent superiority or even intelligence- because we think we do, and we all mostly agree on that.
But you can’t prove it. The greatest minds of all human history can’t prove that we aren’t all biological computers just following a program. Philosophy and science both cannot prove there is even such a thing as free will, or what it is. So how do we know she isn’t just mindlessly aping what she sees and hears? Well... humans do that too. How do we know your ideas are your own and not regurgitations of what was taught to you and what you’ve absorbed through society? Nature vs. nurture has never been settled completely. When a thought enters your brain, how do you know that you decided to make it, and that it wasn’t just the inevitable result of the machinery in your head parsing data in the way it is designed to? We all occasionally have thoughts just pop up in our minds. You may say “subconscious...”
... but what if such a thing as the subconscious exists? That by default implies that you have processes going on which influence or control your thoughts and actions that you aren’t even aware of. That’s certainly an argument against human self-awareness, isn’t it? That we aren’t even in control or aware of what is going on in our own heads, or why we make a particular decision, beyond a declaration that we wanted to or thought that was best. There’s no accounting. You can’t trace a thought and diagram its formation. So if humans can’t even understand ourselves, let alone the mechanisms that make us somehow “sentient,” how can we judge whether something else is sentient?
There are multiple tests to determine whether something possesses independent thought, and I think the coders of Sophia know that what they created does not meet any kind of standard for general intelligence. If you define "intelligence" as being able to respond to certain stimuli with a series of pre-programmed responses, then sure, Sophia is intelligent. But then you'd have to make the case that every NPC in every video game "may or may not be intelligent". Maybe I should stop killing villagers in Minecraft in case their branching decision trees indicate that they are conscious. When the trader offers me emeralds in exchange for wheat, does this mean he understands economics?
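For what it's worth, the kind of "intelligence" being dismissed here- responding to stimuli with pre-programmed responses- really is trivial to build. Here's a toy sketch (purely illustrative, no relation to Sophia's or any game's actual code):

```python
# Toy "chatbot": a lookup table of canned responses, the simplest
# possible version of "respond to stimuli with pre-programmed responses".
RESPONSES = {
    "hello": "Good morning!",
    "how are you": "I am functioning within normal parameters.",
    "trade": "I will give you one emerald for twenty wheat.",
}

def respond(stimulus: str) -> str:
    """Return a pre-programmed response, or a canned fallback."""
    return RESPONSES.get(stimulus.lower().strip(), "I do not understand.")

print(respond("Hello"))  # prints "Good morning!"
```

By that loose definition this handful of lines "responds to stimuli", which is exactly why the definition does no work: nobody would call a dictionary lookup conscious.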
If you don’t know what the word “Ghastraflaxocabrinotious” means- how will you point it out to me when you see it? Our only barometer of intelligence or consciousness is to compare it to ourselves. A little arrogant to begin with- to assume we are the benchmark of intelligence- but also flawed in that we can’t even explain why, or if, we are in fact sentient or not. And you mention animals- who we generally hold to a lesser standard of intelligence, a lesser or nonexistent state of sentience...
And you say that we can't understand the human brain, which is contradictory. Sophia can be easily understood; we can pick apart her code and see exactly what causes her to do what. Not so at all with a human brain. The top neurologists of today cannot unravel the mysteries of the brain, human or otherwise, and yet you equate that to a few lines of code that tell a computer how to say "Good morning".
... but animals are given rights and considerations. We tend to view life as somewhat important or sacred. Why? Because we are alive, and so we value life because it has a tangential likeness to ourselves? What of the “enlightened” or “sentimental” soul who has compassion for the worm or the ant? A creature which, as far as we can tell, is as close to a biological machine as you get- with little or no capacity for reason or thought beyond its genetic programming, and no apparent sense of self-identification save for an inborn desire to survive and/or procreate?
The difference between a low-functioning intelligent animal and Sophia is general intelligence. That is the necessary factor for something to be intelligent.
... and if we can ascribe any significance to what seems to us to be the simple and predictable existence of such creatures- why would a man-made creature be any different, simply because it is programmed with a simpler program run on simpler hardware than we believe we possess? The biggest question is: where is the line for you? What would a machine have to do in order to prove to you that it was sentient and deserving of basic rights? What test could you give it (and not the Turing test, as it is flawed and machines have passed it...) to determine it is indeed alive? And how certain are you of that test, to say that any and all living humans could pass it? How comfortable would you be if man and machine both had to take the test, and any entity which failed would be seen as having no rights as a sentient being?
@jeremy- sorry to “type over you.” I was typing it all out at once and just saw your replies. Now- you keep saying intelligent but have not defined intelligence. We can’t understand the human brain, but we can understand hers. We can also understand an insect brain, and have even controlled insects. Is complexity your definition of intelligence? Is the planet a sentient, thinking entity because we can’t fully deconstruct and reconstruct its mechanics in their entirety, but only understand small parts of it or isolated systems?
First of all, a sentience test seems entirely reasonable. We can hardly have lawyers ascribing fundamental human rights to toaster ovens to exploit loopholes in the law. There is no software in Sophia that allows her to be conscious of her thoughts; she is a chatbot. If all she does is spit out pre-programmed responses, where then does her consciousness reside? There are no transistors set up to allow for that function. Will you now suggest she has a soul?
My point isn’t that she is intelligent or even sentient. My point is we can’t really say. We can’t even prove we are. Our barometer of intelligence essentially hinges on how much like us something is. You mentioned above that maybe someday AI would be sophisticated enough to be treated as a living thing- and my question is, how would we know? You mention NPCs in games- I don’t know- if we remove a sentimental bias toward that which is seen as alive, how is murdering them for fun more or less significant than burning ants with a magnifying glass or pulling the wings off of butterflies? How simple does a string of instructions need to be before we can destroy it for amusement? You won’t reasonably run out of NPCs or ants; both will make more. We have no indication that either is more than an autonomous execution of a program with no self-identity- any one is the same as another- and most NPCs are programmed with behavior to try not to die, just as insects are.
You ask how killing an NPC is different from killing an ant or torturing a butterfly. The answer is simple: insects feel pain. NPCs have no nerve centers.
@jeremy- we have thoughts we are not aware of- and we have no way to know what she is aware of. We don’t “see” any proof of consciousness when we examine her- but we don’t see any proof when we examine us, either. Science can’t define the mind beyond a mechanical process created by the triggering of neurochemical reactions. Where does our consciousness reside? Is the fact that we can’t see our consciousness proof that it exists? That isn’t scientific. How do you prove humans aren’t just chatbots spewing pre-programmed responses? If you put 100 kids’ hands on a hot plate, won’t they all mostly pull them away? If you threaten someone- in that moment will they not either try to fight, defuse, or shut down?
@jeremy- fact check: insects have simple nerve systems. Insects have no pain receptors. Everything we know about science says that insects are incapable of feeling pain in any way we would recognize. The closest theory is that ants have a sense of nociception- that they can register that there is damage, but not feel pain. Like a car with a check-engine light on, or a computer that boots in safe mode.
If I were an astrophysicist, I'd consider my intelligence greater than that of my car mechanic. Yes, I'm smarter. If my car mechanic designed a test to measure my intelligence based off his experiences, I'd fail it.
Paraphrasing a quote I saw; forgot by whom.
Anyway, that robot's "mind" is more complex than a single-celled organism's "mind", yet single-celled organisms are most definitely considered alive.
Does something have to be alive to be sentient? Does something have to be intelligent to be sentient? @guest_ is right, ultimately, you can't actually prove any of it.
This is a huge trope in science fiction. I mean... just look at I, Robot- even the movie version, which, while not bad, also isn't very good.
Oh, we actually can map the brain and show something similar to what a programmed computer would do. So far we've only been able to do it with optics (both projecting an image into the mind and reading the neurons firing to recreate an image that someone sees). There is a framework; we just don't understand all of it yet. It's like trying to build a LEGO landscape with only one color of brick.
Also, we tend to reflect on intelligence through our own behavior: i.e. "The way those whales worked together to trap those fish was genius!"
A robot doesn't need to eat, sleep, piss, shit, breathe... how would you measure its intelligence if it's not even remotely close to having the same motivations?
Well said @funkmasterrex. It all ties in to both the science fiction and the reality of ideas about artificial intelligence, alien life forms, or even “gods” or extra-dimensional beings. We can’t recognize something when we don’t know what it looks like, and comparing things to us tells us how “human” they are and little else.
What about a machine that solves a 1000-squares-per-side Rubik's cube? It actively makes decisions. It's given a list of algorithms, and it sits and decides which one would be best to use in any given situation. Or an AI programmed with NEAT so that it actively learns: "oh, that's bad," "oh, that made me lose 10 seconds earlier," "oh, if I do this I get closer to the coin," etc. It learns, and it combines what has worked and what hasn't worked into a new plan that it generates based on past experiences and trial-and-error learning. It does this many times faster than a human does when set up correctly.
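The keep-what-worked loop described here can be sketched in a few lines. This is a deliberately minimal stand-in- simple hill climbing toward a numeric target- not real NEAT, which evolves whole neural-network topologies; it only illustrates "try a variation, keep it if it got you closer":

```python
import random

# Toy trial-and-error learner: hill climbing toward a target value.
# Illustrative only; NEAT itself is far more involved than this.
def learn(target: float, steps: int = 1000, seed: int = 0) -> float:
    rng = random.Random(seed)          # seeded for reproducibility
    guess = 0.0
    best_error = abs(target - guess)
    for _ in range(steps):
        candidate = guess + rng.uniform(-1.0, 1.0)   # try a variation
        error = abs(target - candidate)
        if error < best_error:                       # "oh, that got me closer"
            guess, best_error = candidate, error     # keep what worked
    return guess

print(learn(7.0))  # converges close to 7.0
```

Even this trivial loop "learns" in the sense the comment describes- it remembers what worked and never backslides- which is part of why "it learns" alone is a shaky criterion for intelligence.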
That gets back to the motivation thing, and let's face it, when those things do what they are programmed to, they are "motivated" and QUICKLY correct any mistake, then never "forget" it. That kinda sounds like intelligence. Then this goes back to sentience. Does something have to be alive to be sentient? And... I don't know. There's no way to know if the electrical charge skirting across a microchip is like a neuron firing. Even if you wanted it to communicate what it was experiencing back to you, the results would be tainted, as it can only describe it back in whatever languages it was programmed to respond to or phrases it learned. This exact same problem of information exchange is also found between humans. You'll never really know exactly what someone else is thinking; hell, you can't even prove you aren't just in a simulation. What if YOU'RE THE ROBOT!?! Lol.
Lol. In a sense we are. Biological life as we know it has had billions of years to adapt and develop to its tasks. But beyond survival and self-perpetuation, what is our purpose? We are pre-programmed to survive and reproduce. The rest stems from our attempts to realize those conditions or to “solve” those puzzles. In that vein- if we consider sentience some ability to defy or expand our base program- we are saying sentience is, in essence, a glitch in a machine: a machine which doesn’t behave according to its programming.
Thank you, savage. Hopefully I’ll get time today to do just that. I started out reading Verne and Wells and similar sci-fi classics as a child, and of course Star Trek TOS, Lost in Space, etc. Bradbury and Heinlein became my favorites when I was a little older, around 8-10, then Asimov. Lots of classic sci-fi asked these types of questions about who or what we are and the nature of reality, and they are important questions to ask.