• Grail@multiverse.soulism.net · 24 hours ago

      You’re a probability model. Your brain is just spitting out an approximation of the most likely actions to get you food and sex. If you don’t get enough food and sex, your genes die out and evolution tries again with an iteration of a more successful model. All those neurons are just a fancy way of calculating how to eat more bananas and chase more poontang. You’re nothing more than a mathematical equation for reproduction.

      • daannii@lemmy.worldOP · 18 hours ago

        Nope. We aren’t. In fact, humans don’t work like that at all.

        It’s amazing we ever figured out actual probability math, since it goes against our nature.

        Here is an example.

        I flip a coin 10x. It lands on tails all 10x.

        I will believe that the next flip almost certainly has to be heads.

        It just has to be. Right?

        Wrong.

        It has a 50/50 chance. Just like all the other flips.

        The previous flips have no impact on future flips. The coin does not remember.

        Yet humans will believe the likelihood of heads has increased with every tails flip.

        When it has not. This mistake even has a name: the gambler’s fallacy.
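        Don’t just take my word for it. Here’s a quick Python sketch (trial count and seed are arbitrary choices): simulate fair flips, wait for a run of ten tails, then check the very next flip. It still comes up heads about half the time.

        ```python
        import random

        def flip():
            """One fair coin flip."""
            return "H" if random.random() < 0.5 else "T"

        random.seed(0)   # reproducible; any seed works

        trials = 0       # times we observed ten tails in a row
        heads_next = 0   # how often the flip *after* the streak was heads

        while trials < 2_000:
            streak = 0
            while streak < 10:  # wait for a run of ten tails
                streak = streak + 1 if flip() == "T" else 0
            trials += 1
            if flip() == "H":   # the coin does not remember the streak
                heads_next += 1

        print(f"P(heads | ten tails in a row) ≈ {heads_next / trials:.3f}")  # ~0.5
        ```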

        • Grail@multiverse.soulism.net · 13 hours ago

          Yeah, LLMs are prone to similar biases because they are also bad at conceptualising probability. They’re not calculators, they’re ANNs. They basically operate on dream-logic: whatever makes sense to you in a dream is likely to make sense to an LLM. That’s because dreams are a time when you have intelligence without consciousness, like an LLM. You’re super suggestible and you just go with whatever feels right based on your gut instincts. An LLM is a simulation of a person’s gut instincts.

          • daannii@lemmy.worldOP · 8 hours ago

            Yeah, the prefrontal cortex is offline when you dream, so you don’t question anything.

            Your example is a pretty good analogy. I’m saving it. 👍

            • Grail@multiverse.soulism.net · 4 hours ago

              Ever notice how letters don’t work right in dreams? If you look at writing in a dream, you just know what it means without having to analyse the letters. But if you try to study the letters, they swim around like a Stable Diffusion image. LLMs deal in tokens, not letters. The approximate meaning of each token is learned during the training phase, so the LLM has a gut feeling for how the token should be used. But it doesn’t know how to spell the token, which is why they can’t tell you how many Rs are in the word “strawberry”. Asking an LLM about spelling is like asking a dreamer about spelling. There’s no spelling in dreams, just raw meaning.
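              You can even check the letters-versus-tokens split in code. Here’s a small sketch using the tiktoken library (assuming the cl100k_base encoding; the exact split varies by model): counting Rs is trivial when you can see characters, but the model only ever sees opaque token IDs.

              ```python
              # pip install tiktoken
              import tiktoken

              word = "strawberry"

              # Counting letters is trivial when you can see the characters.
              print(word.count("r"))  # 3

              # An LLM sees token IDs instead. The split depends on the
              # encoding; cl100k_base is just one example.
              enc = tiktoken.get_encoding("cl100k_base")
              tokens = enc.encode(word)
              print(tokens)  # opaque integer IDs
              print([enc.decode_single_token_bytes(t) for t in tokens])  # chunks the model sees
              ```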

      • [deleted]@piefed.world · 24 hours ago

        If that were true, then consent would be meaningless, because people would just be predictive models with no agency to give consent.

        Maybe your comparison is terrible?

        • Grail@multiverse.soulism.net · 23 hours ago

          > consent is meaningless because people are just predictive models

          This is true if one maintains the assumption that predictive models (such as people) can’t experience qualia such as pain. My intent was to disabuse you and daannii of this silly notion. Obviously mathematical models can experience pain, because you’re a mathematical model and you can experience pain.

          • [deleted]@piefed.world · 22 hours ago

            Predictive models and other computing processes do not have feelings or sensations because they don’t have nerves or any other senses. They are complex processes that produce output based on input, like a Magic 8 Ball, with no agency or thought process.

            Reducing biology to just cause and effect is like saying rivers and oceans are the same thing because they both involve moving water, ignoring literally everything else that makes them different.

            • Grail@multiverse.soulism.net · 22 hours ago

              Predictive models are perfectly capable of having nerves and senses. You, for instance. You’re a predictive model and you have nerves and senses.

              Also, what’s this “nerves or any other senses”? What kind of sense doesn’t come through a nerve? I’m starting to think you don’t know as much as I do about neuroscience.