• 0 Posts
  • 9 Comments
Joined 10 months ago
Cake day: June 13th, 2024

  • While it’s true that we don’t understand consciousness, LLMs don’t have the hallmarks of consciousness that humans and other animals do.

    Modern LLMs are essentially just guessing the next word in a sentence. There’s no continuous experience of the world, and there’s no self, no agency.
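    That "guessing the next word" idea can be sketched in a few lines. This is a toy illustration only, not a real LLM: the hypothetical score table stands in for a network's learned output logits, but the decoding step (softmax over scores, then pick the most probable token) is the same basic loop.

    ```python
    import math

    # Hypothetical raw scores (logits) a model might assign to candidate
    # next tokens after some prompt. In a real LLM these come from a
    # neural network, not a hand-written table.
    logits = {"the": 5.0, "cat": 3.0, "sat": 2.0}

    def next_token_probs(logits):
        """Softmax: convert raw scores into a probability distribution."""
        exps = {tok: math.exp(score) for tok, score in logits.items()}
        total = sum(exps.values())
        return {tok: e / total for tok, e in exps.items()}

    probs = next_token_probs(logits)

    # Greedy decoding: emit the single most probable token, append it to
    # the context, and repeat. That loop, scaled up, is text generation.
    print(max(probs, key=probs.get))  # prints "the"
    ```

    The point of the sketch: at each step the model only produces a probability distribution over the next token. There's nothing in that loop that looks like memory of an ongoing experience or a persistent self.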

    Don’t get me wrong, it’s fascinating tech, and if we do one day create machine consciousness, it might incorporate parts of our current technology. But right now it’s pretty safe to say that LLMs aren’t conscious, unless you believe in panpsychism / animism.