
Re: Sentience is meaningless. Sapience is what matters.

Posted: Sat Jul 21, 2018 2:26 pm
by brimstoneSalad
Try to avoid unfalsifiable solipsistic nonsense like carnap's above (he does that a lot).

Imagine that same thing said a couple hundred years ago about race:
1.) Christians tend to anthropomorphize negro behavior. For example someone said:

"Any negro who starts becoming aggressive when you bother him/her very obviously wants to stop being bothered,"

That is how you'd interpret the behavior if it were a Christian, but the cause of the behavior in a negro could be entirely different. In particular, aggressive behavior doesn't require any complex notion of self; it could be a purely instinctual response to particular stimuli.


People actually said stuff like that against equal consideration, and it's disgusting.

You can map that nonsense to ANYTHING. Even the self vs ANYBODY else:
1.) I tend to anthropomorphize others' behavior. For example somebody said:

"Any other human who starts becoming aggressive when you bother him/her very obviously wants to stop being bothered,"

That is how I'd interpret the behavior if it were my own, but the cause of the behavior in any other person could be entirely different. In particular, aggressive behavior doesn't require any complex notion of self; it could be a purely instinctual response to particular stimuli.


Great wisdom to live by as a serial killer! Never have to feel remorse because they're not even real people like you! :roll:

The implication is that we should all be solipsists or racists or whatever we want because you can appeal to radical skepticism about others' intentions forever and it's always unfalsifiable.

It's intellectually dishonest to make arguments like that.
If that doesn't seem acceptable to say about yourself vs. others, or your own race vs. others, then maybe don't say the same about non-humans, because you're just supporting that morally bankrupt reasoning, and it can be applied anywhere.
That's why, again, we should stop speculating on unfalsifiable claims of what is or isn't going on in the black box of the mind and just look at behavior and the implications of that behavior unless or until we find a way to prove something more.

A moral person errs on the side of caution when inflicting potential suffering upon others.

Re: Sentience is meaningless. Sapience is what matters.

Posted: Sat Jul 21, 2018 2:49 pm
by Sapientist
carnap wrote: Sat Jul 21, 2018 1:49 pm The definition in those dictionaries is vague but consistent with what I said.
Well, sure it's consistent with what you said, but that's not the point. The point is that you are adding stuff onto the definition to help your point. You could add a lot onto the definition and still be consistent. The difference is that I don't think my interpretation is adding anything on; it's just taking the definition literally.
But you're looking up the term in the wrong sort of dictionary; those are common language dictionaries, but "sentience" is a technical term in philosophy (and science), so you need to look it up in the Oxford dictionary of philosophy or something similar.
I've debated this a lot and have never heard this before. I don't have a copy of that dictionary readily available. Do you? The common language definition seems to match the definition used in biology.
That depends what you mean by "self-awareness".
How? I mean, it means awareness of one's self, that's it. Can you give any supported definition of self-awareness that would allow for an entity to not be aware of what is happening to them, barring states of unconsciousness or the like?

Using the roomba again as an example, it has 'experiences' but almost certainly has no sense of self nor awareness of itself.
As I pointed out above, you wouldn't use the dictionary definition because it's entirely vague.


The problem with using your definition, though, is that there is no source for it, except maybe Singer's or other authors' writings. As I said, it's a definition vegans seem to have collectively agreed on, and I've never found a source for it.

You can dislike the dictionary definitions, but you can't expect me to just take your definition at face value. If you have a source for a more context specific definition, I'm happy to use it. Until one can be provided (and I am trying), I would prefer to stick with the common language definition, which I think is reasonable.
Very few scientists argue that insects as a whole are sentient.
Everything I've read on the subject indicates that the consensus absolutely is that insects are sentient, with the caveat that sentience is a scale, that they are not capable of reasoning etc (which isn't required by the base definition of sentience anyway).
There is a study where ants passed a basic mirror test, but that doesn't mean they are self-aware, and it's never been duplicated; assuming the results are true, the far more likely explanation is that it's just a simple heuristic. Ants have very poor eyesight and their brains are tiny; ant behavior is based on a variety of mindless heuristics that result in emergent properties as a group. Ants have a grooming routine, so there is probably something about the dot that triggers it.
I had thought ants passing the mirror test had been replicated? Huh. I agree, however, that it was not evidence of self-awareness and that some other explanation is more likely. Bees are a much more interesting case, as they display emotions, seem to have memories (IIRC), and by all indications are self-aware and sentient, by any definition.
But this is another issue, like sentience we have no good test for "self-awareness".
Right, we don't just have one single test, but we have a variety of tests we can do, and a variety of behaviors we can observe that can give us a pretty good idea.
This is only a problem when you're using the colloquial notion of "sentience", the notion used in philosophy (and science) is far more concrete.
So, while writing this reply, I managed to get access to "The Cambridge Dictionary of Philosophy, 2nd edition", which uses the word "sentient" in a number of other definitions but does not provide its own definition of the word, which means it is using the common language definition. The science dictionaries I found online also match the common language definition.

Cambridge is a pretty reputable source, so it would seem that at least in some cases it is the common language definition of sentience I provided above which is used in philosophy. I will continue to try to get access to the Oxford dictionary of philosophy, but I somewhat expect the same result.
Though I agree, when the average person talks about "sentience" (including vegans), what they are talking about varies a lot from person to person and there is little clarity. And this creates a mess when they try to align it with the more concrete philosophic/scientific notion. For example, for them "sentience" may imply a sort of sapience, but when scientists talk about sentience, that isn't what it means at all. This sort of equivocation causes many vegans to falsely believe scientists support what they think about animal cognition.
I've absolutely encountered this. I really do think the star trek wikia page, despite being star trek, has one of the best explanations of the different definitions of sentience and all the ways the word is used. In my experience many vegans arguing for sentience argue as though sentience was synonymous with higher level consciousness and self-awareness, which threw me off when I first started debating and researching the vegan position.

Re: Sentience is meaningless. Sapience is what matters.

Posted: Sat Jul 21, 2018 3:18 pm
by Sapientist
brimstoneSalad wrote: Sat Jul 21, 2018 2:15 pm Sentience implies some bare level of self-awareness to use those senses in an intelligent way to benefit the individual.
I don't think any level of self-awareness is necessarily implied. As you and others have said, it's a scale, but at the bottom of that scale I think you can be sentient and not self-aware at all. In the sense that sentience is having enough of a base operating system to process senses, at least.
Yes, some robots with complex neural networks are sentient in the way of an insect, but that's more of a question of true learning/understanding those senses vs pre-programmed reflex.
My contention is that most non-sapient animals rely almost entirely on pre-programmed reflex.
And on the other end, we have humans who don't really even know they're humans or what that is or means. I'm not talking about people who are mentally retarded, I'm talking about the majority of the population in some areas.

E.g. "You don't have a soul. You are a soul."

Most people don't actually know what they are, they think they're magical floaty ghost things -- so much supernatural halitosis belched into mud -- sitting in a shell and operating it until the angels call them home.

Is that self awareness? Or is it OK to kill basically anybody who has a slightly inaccurate or incomplete concept of self?
I mean....there is no doubt that the humans in your example are self-aware. To even discuss or consider the things they are discussing requires self-awareness, not to mention metacognition.

Stupidity or irrationality or any kind of thinking has to be built on top of self-awareness as well as a host of higher level functions, so you can't use the result of self-awareness as evidence against self-awareness.

The trouble you'll find is explaining at exactly what point you can arbitrarily say something has an accurate and complete concept of self. I think you'll find that distinction impossible to substantiate.
Absolutely we need to consider beings along a spectrum, but our best bet is to do that relative to something we can map properly like sentience/intelligence rather than something we can't really define like self-awareness. Leaving the black box of cognition alone and focusing on the clues we get from behavior is much more tenable.
I don't think it's as hard to define self-awareness as you imply. Harder to test for, but not to define.

Re: Sentience is meaningless. Sapience is what matters.

Posted: Sat Jul 21, 2018 3:43 pm
by Sapientist
brimstoneSalad wrote: Sat Jul 21, 2018 2:26 pm That's why, again, we should stop speculating on unfalsifiable claims of what is or isn't going on in the black box of the mind and just look at behavior and the implications of that behavior unless or until we find a way to prove something more.
We can prove a lot more now. It's not a complete mystery. We have decades of research, numerous tests, observations of behavior through generations and across different environments, as well as all of our understanding of neuroscience and brain structure.
A moral person errs on the side of caution when inflicting potential suffering upon others.
Agreed, but killing does not equate to suffering. Killing a being without pain or fear that is not self-aware is not any kind of suffering, far as I can see.

Re: Sentience is meaningless. Sapience is what matters.

Posted: Sat Jul 21, 2018 7:16 pm
by brimstoneSalad
Sapientist wrote: Sat Jul 21, 2018 3:18 pm I don't think any level of self-awareness is necessarily implied. As you and others have said, it's a scale, but at the bottom of that scale I think you can be sentient and not self-aware at all. In the sense that sentience is having enough of a base operating system to process senses, at least.
0 self awareness means 0 sentience. I'd agree it's likely that many organisms like oysters and some worms aren't processing anything at all, just reacting on reflex: but that means no sentience and no self awareness at the same time.

The key there is the processing of the sense input: what are they processing it with respect to?
For a Darwinian creature, they're processing it relative to themselves and their place in the environment, to determine what they should do.
I don't think you can get any meaningful processing without some implied and very crude sense of self, even if that is just a sense of relative position (which is the rudiments of a sense of self; one of many parts of the full picture).
Sapientist wrote: Sat Jul 21, 2018 3:18 pmMy contention is that most non-sapient animals rely almost entirely on pre-programmed reflex.
Most in what sense? If you're talking worms and plankton (which probably outnumber the rest of us) then that may be true.
But if you're talking macrofauna then you are certainly wrong.
Experiments have proved operant conditioning (true learning, definitely not pre-programmed) in higher animals. Insects are more in the grey area, there has been indication of true learning in many insects too (even if quite crude).
Sapientist wrote: Sat Jul 21, 2018 3:18 pmI mean....there is no doubt that the humans in your example are self-aware. To even discuss or consider the things they are discussing requires self-awareness, not to mention metacognition.
But they literally aren't fully self-aware, they're mistaken. If you want some complete self-awareness they lack it. They're "aware" of something that doesn't even exist. They aren't aware that they are primates evolved from simpler life forms with minds running on wetware that will vanish when they die. There's no self-awareness to any of that.

Why do you think that to discuss those things requires that knowledge?

They have awareness of things like relative position, but so do insects. As with insects, part of the picture is there but it's not complete.
Sapientist wrote: Sat Jul 21, 2018 3:18 pmStupidity or irrationality or any kind of thinking has to be built on top of self-awareness as well as a host of higher level functions, so you can't use the result of self-awareness as evidence against self-awareness.
No, you're just begging the question.

That's not self-awareness, it's delusion. They aren't fully aware of what they actually are. Just in the way an insect doesn't know it's an insect, they don't know they're material beings and they have no valid concept of death.
Sapientist wrote: Sat Jul 21, 2018 3:18 pmI don't think it's as hard to define self-awareness as you imply. Harder to test for, but not to define.
Then please, define it. Explain how a Christian is fully and equally self aware when that person doesn't really know what he or she is.

Sapientist wrote: Sat Jul 21, 2018 3:43 pm We can prove a lot more now. It's not a complete mystery. We have decades of research, numerous tests, observations of behavior through generations and across different environments, as well as all of our understanding of neuroscience and brain structure.
Sure, I was speaking to the argument from ignorance being made; carnap was dismissing behavioral cues with speculation in the same way people have dismissed those of other races in the past, and the way solipsists do. There's always an elaborate and unfalsifiable explanation you can make up to dismiss the mental reality of others, and it's a bad idea.

We have overwhelming evidence through things like operant conditioning that macro-fauna are intelligent and exhibit true learning (which absolutely indicates self awareness somewhere on the spectrum). That's the best smoking gun you'll ever find.
Sapientist wrote: Sat Jul 21, 2018 3:43 pmAgreed, but killing does not equate to suffering. Killing a being without pain or fear that is not self-aware is not any kind of suffering, far as I can see.
I would say it's reasonable to eat things like oysters and worms that we can be reasonably sure have no interests.

However, assuming that about larger animals (which are typically farmed) because their self-awareness is not as complete as most humans' puts you in a tricky position of giving theists lower moral value due to their significantly lower level of self-awareness compared to atheists.

If Atheists can eat Christians without moral qualms (provided they kill them quickly and painlessly) then you might have an argument there. ;)

Re: Sentience is meaningless. Sapience is what matters.

Posted: Sat Jul 21, 2018 8:32 pm
by Sapientist
brimstoneSalad wrote: Sat Jul 21, 2018 7:16 pm 0 self awareness means 0 sentience.
I'm sorry, but that's just wrong, or you are using your own definitions of these words.

You cannot be self-aware without being sentient, but you can certainly be sentient without being self-aware.
The key there is the processing of the sense input: what are they processing it with respect to?
See my roomba example.
For a Darwinian creature, they're processing it relative to themselves and their place in the environment, to determine what they should do.
That doesn't require self-awareness.
I don't think you can get any meaningful processing without some implied and very crude sense of self, even if that is just a sense of relative position (which is the rudiments of a sense of self; one of many parts of the full picture).
Having an awareness of one's body is not the same as having a sense of self the way it is being used in this discussion. A roomba can know the dimensions of itself, and yet it is not self-aware.
Most in what sense? If you're talking worms and plankton (which probably outnumber the rest of us) then that may be true.
But if you're talking macrofauna then you are certainly wrong.
Most as in most, the majority of animals.
Experiments have proved operant conditioning (true learning, definitely not pre-programmed) in higher animals. Insects are more in the grey area, there has been indication of true learning in many insects too (even if quite crude).
Experiments have shown basic conditioning, sure, but this doesn't require self-awareness as a prerequisite.
But they literally aren't fully self-aware, they're mistaken. If you want some complete self-awareness they lack it. They're "aware" of something that doesn't even exist. They aren't aware that they are primates evolved from simpler life forms with minds running on wetware that will vanish when they die. There's no self-awareness to any of that.
I don't think you understand the point that was made above. What you are saying literally does not make sense. To even consider any religious argument as in your example, requires self-awareness. That is indisputable.
No, you're just begging the question.
I'm really not. Your example is absurd on its face. You're trying to argue that someone doesn't have a trait because of a certain action, when that action is only possible if they possess that trait in the first place.
Then please, define it. Explain how a Christian is fully and equally self aware when that person doesn't really know what he or she is.
See my previous point.
We have overwhelming evidence through things like operant conditioning that macro-fauna are intelligent and exhibit true learning (which absolutely indicates self awareness somewhere on the spectrum). That's the best smoking gun you'll ever find.
It's really not. Basic learning and conditioning is not evidence of self-awareness, at least to the extent I am talking about, that would allow someone to be aware of their own life and value it. They have a slight upgrade to the same operating system insects are running on, which means they have more complicated if-then-else conditionals. They still rely on pre-programmed instinct, and are still essentially machines.
I would say it's reasonable to eat things like oysters and worms that we can be reasonably sure have no interests.
I don't really care about 'interests'. You could say bacteria have interests also. I care about self-awareness. If a being is not self-aware enough to be conscious of its own life, then it has no right to it. Interests or not.
However, assuming that about larger animals (which are typically farmed) because self-awareness is not as complete as most humans puts you in a tricky position of giving theists lower moral value due to their significantly lower level of self awareness compared to atheists.
With respect, you are still not understanding the argument. Every single human is self-aware in the way that term is being used in this discussion, and their beliefs have literally no bearing on that level of self-awareness. The self-awareness that is innate to humans manifests at a lower level than whatever beliefs or ideas they use that self-awareness to come up with.

Re: Sentience is meaningless. Sapience is what matters.

Posted: Sat Jul 21, 2018 9:07 pm
by Cirion Spellbinder
Sapientist wrote: Sat Jul 21, 2018 1:07 pmI take issue with this. The term 'subjective experience' is not mentioned in the dictionary at all, and that it is implied I think is a contention not directly supported.
Right, but many words are redundant and others correspond to other words. It’s fine to call a goldfish a “type of fish” or “a variety of limbless cold-blooded vertebrate animal with gills and fins, living wholly in water.” You’re saying the same thing.
Sapientist wrote:Would you say an advanced model of roomba can have a subjective experience? It can learn, make decisions, gets tired, knows to recharge (and won't work if it doesn't), performs sub-optimally when on low power...
Firstly, roombas do not, in any meaningful sense of the word “get tired.” Performing poorly on low battery and needing to recharge don’t imply getting tired and are wholly programmed responses. The roomba is not “experiencing” anything, it just has physical limits and a program that says do A if B, C if D, and so on. The roomba didn’t “learn” to do these things any more than a boulder learns to fall, slow down due to friction, or shatter when pushed to its limits.
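To make that contrast concrete, here is a minimal sketch of the kind of fixed "do A if B, C if D" controller described above (purely illustrative: the sensor names and actions are invented, not taken from any actual roomba firmware). The point is that the rules never change as a result of running them.

```python
# Illustrative sketch only: a purely reactive "do A if B, C if D" controller.
# The rules are fixed at write time; nothing about running the program
# changes them, which is the sense in which such a device does not "learn".

def reactive_step(sensors: dict) -> str:
    """Map the current sensor reading directly to a fixed action."""
    if sensors.get("bumper_pressed"):
        return "back_up_and_turn"
    if sensors.get("cliff_detected"):
        return "stop"
    if sensors.get("battery_low"):
        return "return_to_dock"
    return "drive_forward"

# The same input always produces the same output, run after run.
print(reactive_step({"bumper_pressed": True}))  # back_up_and_turn
print(reactive_step({"battery_low": True}))     # return_to_dock
```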
Sapientist wrote:You might think this comparison is ridiculous, but I see no difference between this really and a simple animal with relatively few neurons. Both are, in my view, essentially just machines following programming (well, one certainly is).

It’s unfortunate then that a roomba has no neurons and not all animals have “relatively few neurons.”
Merriam Webster wrote:responsive to or conscious of sense impressions
Sapientist wrote:Exactly as i said. Enough of a mind to use senses. That's it.
Your definition is needlessly vague. What is meant by “using senses” is not clear at all and could very well be “forming a subjective experience.”
Sapientist wrote:It has senses (it can react to things, e.g. bumping into a table and making a decision based on that), and it has consciousness as you use it (awareness of input..it is aware if it has bumped into something).
Roombas do not make decisions or have any awareness of their input, they execute programs. There is no mind here, like a boulder rolling down a hill. Or are boulders also sentient? After all, they “decide” to execute the laws of physics in a manner contingent on their circumstances. Electrical switches must also be sentient, as they are “conscious” of their on/off state. A neural network obviously isn’t necessary to facilitate any of this.
Sapientist wrote:If I have something that I am not and never will be aware of, and you take it from me, what is the harm?
Because you care about your current state, no matter if you can reflect about the fact that it is yours.
Sapientist wrote:Interesting point. I would say that sapience is woven into the fabric of the mind in a way that is impossible for non-sapient animals.
You have no evidence of this. How do you even know I am sapient?
Sapientist wrote:Our self-awareness may be dulled and drowned out by the physical pain and instinctual reaction, but 'we' are still there, observing, watching and making decisions....reacting in a way that a simple sentient being is incapable of.
Except you’ve stated earlier that a roomba (apparently a sentient being) is capable of observing and watching (awareness of its environment) and decisions (like you said, it apparently chooses to turn at walls).

Re: Sentience is meaningless. Sapience is what matters.

Posted: Sat Jul 21, 2018 9:14 pm
by Cirion Spellbinder
@brimstoneSalad, would it be better if we just said that capacity for preferences was the condition for moral consideration so people don’t have to make the inference from sentience? It seems much easier just to explain why preferences matter than to explain why sentience implies preferences and preferences matter.

Re: Sentience is meaningless. Sapience is what matters.

Posted: Sat Jul 21, 2018 9:45 pm
by Sapientist
Cirion Spellbinder wrote: Sat Jul 21, 2018 9:07 pm Right, but many words are redundant and others correspond to other words. It’s fine to call a goldfish a “type of fish” or “a variety of limbless cold-blooded vertebrate animal with gills and fins, living wholly in water.” You’re saying the same thing.
Right, but as I said, I think your definition is adding extra stuff onto the common language definition, rather than paraphrasing it.
Firstly, roombas do not, in any meaningful sense of the word “get tired.” Performing poorly on low battery and needing to recharge don’t imply getting tired and are wholly programmed responses.
When low on power, it acts sluggishly. That's pretty much spot on behavior for getting tired.
The roomba is not “experiencing” anything, it just has physical limits and a program that says do A if B, C if D, and so on. The roomba didn’t “learn” to do these things any more than a boulder learns to fall, slow down due to friction, or shatter when pushed to its limits.
The roomba has senses, and makes decisions based on input from those senses, so I think it is experiencing something at least as much as a simple insect does.

And it does learn, it's not just basic conditionals, but actual learning using AI. It's pretty advanced. There's a reason it costs ~$1000.
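For contrast with a purely fixed rule table, here is a minimal sketch of what "learning" means in even the weakest sense at issue here (again purely illustrative; the dirt-level and feedback names are invented, and this is not a claim about how iRobot's software actually works): a parameter that changes with experience, so the same input can produce different behavior later.

```python
# Illustrative sketch only: the minimal sense of "learning" at issue is
# behavior that changes as a function of experience. One adjustable
# parameter is enough to distinguish this from a fixed rule table.

def make_learner(threshold: float = 0.5, step: float = 0.1):
    state = {"threshold": threshold}

    def act(dirt_level: float) -> str:
        """Decide based on the current (learned) threshold."""
        return "spot_clean" if dirt_level > state["threshold"] else "keep_moving"

    def feedback(was_useful: bool) -> None:
        # Lower the threshold after useful cleanings, raise it after wasted
        # ones, so the same dirt_level can trigger different actions later.
        state["threshold"] += -step if was_useful else step

    return act, feedback

act, feedback = make_learner()
print(act(0.55))   # spot_clean
feedback(False)    # that cleaning wasn't useful; threshold rises
print(act(0.55))   # keep_moving (same input, different action)
```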
It’s unfortunate then that a roomba has no neurons and not all animals have “relatively few neurons.”
Of course not all animals have relatively few neurons, but many do. And the lack of neurons is irrelevant. Neurons are one means to an end... the progress we're making with AI should indicate that neurons are not a necessary ingredient for sentience or self-awareness.
Your definition is needlessly vague. What is meant by “using senses” is not clear at all and could very well be “forming a subjective experience.”
Except it's not my definition, it's literally the dictionary definition without adding anything on top of it.

It's not that it's vague, it's that you've been using it incorrectly all this time.
Roombas do not make decisions or have any awareness of their input, they execute programs.
They certainly do make decisions and have awareness of input. Again, there is a reason they are so expensive. Go take a look at the feature page, they are a lot more advanced than you seem to realize.
There is no mind here, like a boulder rolling down a hill.
A boulder is not learning and making decisions. A roomba is comparable to a very simple insect, perhaps a fruit fly. Hell, I'm pretty sure it is more advanced than a fruit fly.

For what it's worth, I consider a fruit fly to only be executing programs and not have a mind in any meaningful sense.

You are dismissing and mocking the idea that a roomba is as advanced as a simple insect, but you haven't actually made an argument against it, and some of what you stated was simply wrong (saying they can't learn, for example).
Because you care about your current state, no matter if you can reflect about the fact that it is yours.
I do as a human, I'm not convinced, say, a fruitfly does.
You have no evidence of this. How do you even know I am sapient?
Philosophical zombies is a nice thought experiment, but not especially relevant to this discussion. let's stay on topic.
Except you’ve stated earlier that a roomba (apparently a sentient being) is capable of observing and watching (awareness of its environment) and decisions (like you said, it apparently chooses to turn at walls).
You seem to think I am contradicting myself, but I don't see that I am at all. Could you clarify what you think I am?

Re: Sentience is meaningless. Sapience is what matters.

Posted: Sun Jul 22, 2018 3:14 am
by Cirion Spellbinder
Sapientist wrote:When low on power, it acts sluggishly. That's pretty much spot on behavior for getting tired.
Tiredness is not its associated behaviors, it is a state of mind and can give way to different sets of behaviors.
Sapientist wrote:And it does learn, it's not just basic conditionals, but actual learning using AI. It's pretty advanced. There's a reason it costs ~$1000.
Does it have a neural network (artificial or not) or does it just build a map or something?

Also, though it isn't really relevant to the discussion: high price =/= high quality
Sapientist wrote:Except it's not my definition, it's literally the dictionary definition without adding anything on top of it.
Actually, like I cited before, the dictionary definition is literally:
Merriam Webster wrote:responsive to or conscious of sense impressions
which is literally not:
not Merriam Webster wrote:Enough of a mind to use senses.
and that you've also claimed as your own:
Sapientist wrote:Exactly as i said. Enough of a mind to use senses.
Sapientist wrote:They certainly do make decisions and have awareness of input. Again, there is a reason they are so expensive. Go take a look at the feature page, they are a lot more advanced than you seem to realize.
Could you link it?
Sapientist wrote:I do as a human, I'm not convinced, say, a fruitfly does.
I don't really care about insects to be honest, even if they meet standards for moral consideration, they do so barely and with proportional consideration. Do you think that most animals are worthy of moral consideration?
Sapientist wrote:Philosophical zombies is a nice thought experiment, but not especially relevant to this discussion. let's stay on topic.
You are correct.
Sapientist wrote:You seem to think I am contradicting myself, but I don't see that I am at all. Could you clarify what you think I am?
You said:
Sapientist wrote:Our self-awareness may be dulled and drowned out by the physical pain and instinctual reaction, but 'we' are still there, observing, watching and making decisions....reacting in a way that a simple sentient being is incapable of.
You listed (1) observing, (2) watching, and (3) making decisions as the reactions which differentiate us from non-self-aware beings, even in lapses of self-awareness. Yet you have described the apparently sentient (but not self-aware) roomba as still (1) observing, (2) watching, and (3) making decisions.
Sapientist wrote:It has senses (it can react to things, e.g. bumping into a table and (3) making a decision based on that), and (1&2) it has consciousness as you use it (awareness of input..it is aware if it has bumped into something).
Therefore, you have said that what distinguishes us from the merely sentient in lapses of self-awareness is observing, watching, and making decisions, and you have also stated that the merely sentient possess these same capabilities (which means they are not unique or distinguishing).