> In what ways do LLMs meet the definition of having consciousness and agency?
Agency: the ability to make decisions and act on them independently. Agentic pipelines already do this.
Consciousness: something something feedback[1] (or a non-transferable feeling of being conscious, but that is useless for this discussion). Recurrent Processing Theory: a computation is conscious if it involves high-level processed representations being fed back into the low-level processors that generated them.
Tokens are being fed back into the transformer.
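That feedback loop is just autoregressive decoding: each generated token is appended to the context and becomes input for the next forward pass. A minimal sketch (the "model" here is a hypothetical stand-in rule, not a real transformer):

```python
# Sketch of autoregressive feedback: high-level output (the next token)
# is appended to the context and fed back in as input.

def toy_model(context):
    # Stand-in for a transformer forward pass: emits the next token.
    return (context[-1] + 1) % 10

def generate(prompt, steps):
    context = list(prompt)
    for _ in range(steps):
        next_token = toy_model(context)  # model output...
        context.append(next_token)       # ...fed back as model input
    return context

print(generate([3], 4))  # [3, 4, 5, 6, 7]
```

Whether this token-level recurrence counts as the kind of feedback Recurrent Processing Theory means is exactly the open question.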
> that's a bit like looking at a bird in flight and imagining going to the moon: only tangentially related to engineering reality.
Is it? The vacuum of space is a concrete, known obstacle for aerodynamics-based propulsion. What analogous obstacle do we have with ML? The scaled-up monkey brain[2] might not qualify as the moon.
[1] https://www.astralcodexten.com/p/the-new-ai-consciousness-pa...
[2] https://www.frontiersin.org/journals/human-neuroscience/arti...