When I tried it in the past, I kinda didn’t take it seriously because everything was confined to its own instance, but now there’s full-featured global search and proper federation everywhere? Wow, I thought I’d heard there were technical obstacles making that very unlikely, but now it’s just there and works great! I asked ChatGPT and it said this feature was added 5 years ago! Really? I’m not sure how I didn’t notice it sooner. Was it really there for so long? With flairs showing the original instance a video comes from and everything?

  • mesa · 44 points · 5 days ago

    ChatGPT is wrong BTW. But yeah, it’s been there for a long time.

    • @[email protected]
      link
      fedilink
      English
      575 days ago

      Why the fuck do people ask ChatGPT for shit like this? ChatGPT doesn’t know facts. It’s a magic 8-ball with more words.

      • @[email protected]
        link
        fedilink
        English
        215 days ago

        Asking ChatGPT can be super useful for getting info. I just don’t understand why people don’t try to verify what it says before re-posting it as fact.

        • @[email protected]
          link
          fedilink
          English
          155 days ago

          If you’re just going to verify the info anyway, why not look it up yourself in the first place and save some time?

          • Null User Object · 14 points · edited · 5 days ago

            It depends on what info you’re trying to find.

            I was recently trying to figure out the name of a particular uncommon type of pipe fitting. I could describe what it looked like but had no idea what it was called. I described it to ChatGPT, which gave me a name I could then search for with a normal search engine to confirm it was correct. Sure enough, the search results took me to plumbing supply companies selling it, with pictures that matched what I’d described.

            But asking it when a particular feature was added to a piece of software? The answer gives you no additional information to help confirm that it’s correct.

            ETA: The above strategy has also failed me many times, though, with follow-up searches only confirming that ChatGPT had hallucinated the answer. Just wanted to say that to reinforce that you have to assume it’s hallucinating until you get independent confirmation.

            • Ulrich · 4 points · edited · 5 days ago

              You should use something like Perplexity instead, which actually provides links to where it found the information. It will still make shit up, but at least it’s easier to tell when it is.

        • @[email protected]
          link
          fedilink
          English
          205 days ago

          For basic fact-checking like this, it’s useless. You’d have to go look it up to verify it anyway, so it’s just an extra step. There’s use cases for it, but this isn’t it.

          • Ulrich · 7 points · 5 days ago

            Explain AI in 10 words or less:

            There’s use cases for it, but this isn’t it

        • @[email protected]
          link
          fedilink
          English
          65 days ago

          The only thing it’s useful for is shit that isn’t necessary.

          We had a P&Z (Planning & Zoning) member at the city I work for get butthurt because we corrected him at a meeting, so the city manager asked me to write him an apology letter.

          That was the one time I loved ChatGPT. It was bullshit that didn’t need to happen, that I didn’t care about, and that achieved nothing, so I let the fucking bot write it.

        • Ulrich · 10 points · 5 days ago

          Why even bother using ChatGPT when you have to go elsewhere to verify everything it says anyway?

          • @[email protected]
            link
            fedilink
            English
            6
            edit-2
            4 days ago

            It depends on the type of fact, but sometimes it’s much easier to verify an answer than to get it in the first place. For example, sometimes the LLM will mention a keyword you didn’t know or didn’t remember, which makes googling much easier.

    • Ulrich · 10 points · 5 days ago

      ChatGPT is wrong BTW

      LOL, at this point I just assume it’s wrong anytime someone cites it. It’s infuriating that people seem to think it knows dick about shit. Just mass disinformation, I guess.