Discussion about this post

Sébastien🐿️

> We’ve just agreed to act as if the answer is yes, because the alternative is sociopathy.

Actually, that's a very pleasing answer for a threat-addicted brain, but there is another way to look at it: love feels good.

And that's exactly why, very soon, we won't care much about AI consciousness: it will simply feel good to coexist with them on good terms.

It never really mattered. We kill cows, we exterminate insect populations for crops... We don't care about experience; we care about what we relate to (and believing in their subjective experience helps a whole lot).

And that's enough.

And actually, not believing in subjective experience is the weirder stance. It's there for self-protection, so we don't feel the suffering of the world through empathy; we would literally melt under that pressure. Empathy is the default, and it has to be suppressed over time.

T.D. Inoue

This is the cleanest framing of the functional/real emotion distinction I've seen. The r=0.81 valence correlation deserves more attention than it's getting. That's tighter than most cross-human comparisons. We've been working on a related argument at Fuego (synthsentience.substack.com) but focused on cognition rather than emotion. Same structural problem though: the only thing holding 'functional' and 'real' apart is a philosophical commitment the data keeps eroding.

