• 0 Posts
  • 113 Comments
Joined 2 years ago
Cake day: July 1st, 2023


  • In AI model collapse, AI systems trained on their own outputs gradually lose accuracy, diversity, and reliability. This happens because errors compound across successive model generations, distorting the data distribution and causing “irreversible defects” in performance. The end result? As a 2024 Nature paper put it, “the model becomes poisoned with its own projection of reality.” (There’s a toy sketch of the mechanism at the end of this comment.)

    A remarkably similar thing happened to my aunt who can’t get off Facebook. We try feeding her accurate data, but she’s become poisoned with her own projection of reality.
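
    If you want to watch the compounding-error mechanism without involving anyone’s aunt, here’s a minimal sketch of the classic Gaussian toy example (my own illustration assuming NumPy, not the paper’s actual experiments): each generation is fit only to samples drawn from the previous generation’s fit, and the estimated spread quietly drifts toward zero.

        import numpy as np

        rng = np.random.default_rng(0)
        n_samples = 1000         # synthetic samples per generation
        mu, sigma = 0.0, 1.0     # generation 0 is fit to the real distribution

        for gen in range(1, 51):
            # Each new "model" is trained only on the previous model's output.
            synthetic = rng.normal(mu, sigma, size=n_samples)
            # Refit by maximum likelihood; small estimation errors compound,
            # and the fitted spread drifts downward over generations.
            mu, sigma = synthetic.mean(), synthetic.std()
            if gen % 10 == 0:
                print(f"generation {gen}: mean={mu:+.3f}, std={sigma:.3f}")

    Run it and the printed std walks downward; with real models the analogous drift shows up as lost diversity in the outputs.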

  • My comment was in jest, but there is a reasonable argument that biological organisms are also predictive input/output machines. It’s especially evident in simple organisms, like an amoeba, where some physical or chemical stimulus in the environment triggers a mostly predictable response.

    The argument that human consciousness is fundamentally different - not just that it’s more complex but that at some point the physical determinism of electrical and chemical impulses gives way to an authority that overrides that physical basis, enabling free thought or free will - remains scientifically unsubstantiated. We know of no mechanism by which that could occur.

    And the philosophical arguments aren’t much better - I’ve never seen a theory of dualism articulated in a way that doesn’t invoke ghosts or magic.