• @[email protected]
    link
    fedilink
    English
    35 days ago

    I was talking about AI training on AI output. AI needs genuine data; a feedback loop of models training on their own output makes them regress. See how AI makes yellow-tinted pictures now because of the Ghibli AI thing.
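
    A minimal toy sketch of that feedback loop (my own illustration in plain numpy, not any real training pipeline): fit a trivial Gaussian "model" to data, sample from it, refit on the samples, and repeat. The estimated spread tends to shrink across generations, which is the regression I mean.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100  # small sample per "generation" makes the effect show up quickly

    # Generation 0 trains on genuine data from N(0, 1).
    data = rng.normal(0.0, 1.0, size=n)

    for gen in range(301):
        # "Train" a toy generative model: estimate mean and spread from the data.
        mu, sigma = data.mean(), data.std()
        if gen % 50 == 0:
            print(f"gen {gen:3d}: mu={mu:+.3f}  sigma={sigma:.3f}")
        # Every later generation trains only on the previous model's output.
        data = rng.normal(mu, sigma, size=n)

    # mu wanders and sigma tends to shrink toward 0 over the generations:
    # the feedback loop slowly collapses the original distribution.
    ```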

    • @[email protected]
      link
      fedilink
      English
      14 days ago

      Sure, though that mainly applies when it’s the same model training on its own output. If a model trains on a different one’s output, it might pick up some good features from it, but it picks up the bad sides as well.

        • @[email protected]
          link
          fedilink
          English
          4 days ago

          Even if they weren’t trained on the same data, the result ends up similar.

          Training an inferior model on a superior model’s output can narrow the gap between the two. It won’t be optimal by any means, and you might fuck up its future learning, but it works to an extent.

          The data you feed it should still be good quality, though (rough sketch of the idea below).
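
          For what it’s worth, here is a rough sketch of that idea: plain knowledge distillation on a toy problem (my own made-up setup in PyTorch; the teacher/student sizes, temperature and data are all illustrative, not anyone’s actual method). The student only ever sees the teacher’s soft outputs, so it ends up tracking the teacher’s behaviour, mistakes included.

          ```python
          import torch
          import torch.nn as nn
          import torch.nn.functional as F

          torch.manual_seed(0)

          # Toy data: XOR-style labels over two features.
          x = torch.randn(2000, 2)
          y = (x[:, 0] * x[:, 1] > 0).long()

          teacher = nn.Sequential(nn.Linear(2, 128), nn.ReLU(), nn.Linear(128, 2))
          student = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 2))

          # 1) Train the "superior" teacher on genuine labels.
          opt = torch.optim.Adam(teacher.parameters(), lr=1e-2)
          for _ in range(300):
              opt.zero_grad()
              F.cross_entropy(teacher(x), y).backward()
              opt.step()

          # 2) Train the "inferior" student only on the teacher's output.
          T = 2.0  # temperature softens the teacher's distribution
          with torch.no_grad():
              soft_targets = F.softmax(teacher(x) / T, dim=1)
          opt = torch.optim.Adam(student.parameters(), lr=1e-2)
          for _ in range(300):
              opt.zero_grad()
              loss = F.kl_div(F.log_softmax(student(x) / T, dim=1),
                              soft_targets, reduction="batchmean") * T * T
              loss.backward()
              opt.step()

          # The student narrows the gap with the teacher, but by construction it
          # also copies the teacher's errors: the "bad sides as well" caveat above.
          with torch.no_grad():
              acc_t = (teacher(x).argmax(1) == y).float().mean()
              acc_s = (student(x).argmax(1) == y).float().mean()
          print(f"teacher acc {acc_t:.3f}  student acc {acc_s:.3f}")
          ```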