> Sentience broadly (& naively) covers the ability to independent thinking, rationalize outcomes, understand fear/threat, understand where it is wrong (conscience) and decide based on unseen information & understand what it doesn't know.
This seems to me a rather anthropomorphic definition. It seems entirely possible for a system to lack these qualities and yet be sentient, or vice versa. The qualities you point to appear in humans (and other creatures) because evolutionary pressures made them advantageous (e.g., for coordinating within groups), but none of them actually depends on sentience (and, looking at it neurologically, it is hard to imagine how such a dependency could even work).
Observing behavior and attempting to infer an internal state is a perilous task, and it will lead us further astray as we build more complex systems. The only way to prove sentience is to demonstrate the mechanism by which it arises; otherwise we will keep grasping at comparisons as poor proxies for actual understanding.