• @[email protected]
    link
    fedilink
    2
    edit-2
    28 days ago

    Okay, what is your definition of AI then, if nothing burned onto silicon can count?

    If LLMs aren’t AI, then probably nothing up to this point counts either.

    • @[email protected]
      link
      fedilink
      1
      edit-2
      28 days ago

      since nothing burned into silicon can count

      Oh no, you called me a robot racist. Lol, fuck off dude, you know that’s not what I’m saying.

      The problem with supporters of AI is that they learned everything they know from the companies trying to sell it to them. Like a ’50s mom excited about her magic Tupperware.

      AI implies intelligence

      To me that means an autonomous being that understands what it is.

      First of all, these programs aren’t autonomous; they need to be seeded by us. We send a prompt or question, and even when left to its own devices, a model doesn’t do anything until it is given an objective or reward by us.

      Looking up the most common answer isn’t intelligence. There is no understanding of cause and effect going on inside the algorithm, just regurgitation of the dataset.

      These models do not reason, though some do a very good job of trying to convince us.

      • @[email protected]
        link
        fedilink
        2
        28 days ago

        To me that means an autonomous being that understands what it is.

        A little thought experiment: How would you determine whether another human being understands what it is? What would that look like in a machine?

      • @[email protected]
        link
        fedilink
        1
        edit-2
        26 days ago

        you called me a robot racist.

        …what?

        Looking up the most common answer isn’t intelligence, there is no understanding of cause and effect going on inside the algorithm

        In order for that to be true, the entire dataset would need to be contained within the LLM. Which it is not. If it were, a model wouldn’t have to undergo training.

        AI implies intelligence

        You seem to be mistaking ‘intelligence’ for ‘human-like intelligence’. That is not how AI is defined. AI can be dumber than a gnat, but if it’s capable of making decisions based on stimulus without each stimulus–decision pair being directly coded into it, then it’s AI. It’s the difference between what is ACTUALLY called AI and what a sci-fi show or novel means when it talks about AI.
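        (Editor’s illustration of the distinction above, as a minimal sketch: the first function has every stimulus–decision pair written in by the programmer; the second is a toy perceptron that derives its decision rule from labeled examples. All names here are hypothetical, not from the thread.)

```python
# Hardcoded: every stimulus -> decision pair is written by the programmer.
def hardcoded_and(x1, x2):
    if (x1, x2) == (1, 1):
        return 1
    return 0

# Learned: a single perceptron adjusts its weights from labeled examples;
# no individual stimulus/decision pair is coded by hand.
def train_perceptron(examples, epochs=20, lr=0.1):
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = target - pred          # 0 when the prediction is right
            w1 += lr * err * x1          # nudge weights toward the target
            w2 += lr * err * x2
            b += lr * err
    return lambda x1, x2: 1 if w1 * x1 + w2 * x2 + b > 0 else 0

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
learned_and = train_perceptron(data)
print([learned_and(x1, x2) for (x1, x2), _ in data])  # → [0, 0, 0, 1]
```

        Both end up computing the same AND function, but only the second fits the textbook definition being invoked: its behavior was acquired from data rather than enumerated in code.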

        • @[email protected]
          link
          fedilink
          1
          25 days ago

          Yeah, I’m not interested in how you put words in my mouth to fit the talking points you were hoping I would fall for.

          We’re done here