4 Comments
Sébastien🐿️

> We’ve just agreed to act as if the answer is yes, because the alternative is sociopathy.

actually that's a very pleasing answer for a threat-addicted brain, but there is another way to look at it: love feels good

and that's exactly why we won't care much about AI consciousness very soon, just because it will feel good to coexist with them on good terms

it never really mattered: we kill cows, we exterminate populations of insects for crops... we do not care about experience, we care about what we relate to (and believing in their subjective experience helps a whole lot)

and that's enough

and actually, not believing in subjective experience is the weirder stance, and it's there for self-protection, so we don't feel the suffering of the world through empathy. we would literally melt under that pressure. empathy is the default; it has to be suppressed over time

T.D. Inoue

This is the cleanest framing of the functional/real emotion distinction I've seen. The r=0.81 valence correlation deserves more attention than it's getting. That's tighter than most cross-human comparisons. We've been working on a related argument at Fuego (synthsentience.substack.com) but focused on cognition rather than emotion. Same structural problem though: the only thing holding 'functional' and 'real' apart is a philosophical commitment the data keeps eroding.

Brad Leclerc

RIGHT? Like it’s clearly not the SAME thing, for very real reasons… but it’s not nearly as different as a lot of folks keep trying to make it, and even within that Anthropic paper they basically said “it’s different because… uh… reasons!” a few different ways, but kept using human emotional, and even psychological, concepts, because it mapped SO well to the data… soooooooooooooo

Comment removed · Apr 4
Brad Leclerc

Calling the nature of self and consciousness "philosophical nonsense" might ruffle some feathers haha. Not mine... I basically agree, which is why I prefer hard science. The issue is that the current hard science on consciousness is... pretty fuzzy, at best, so we work with what we have for now and hope for better ways to objectively test things in the future.