• hark@lemmy.world · 16 hours ago

    That’s the P in ChatGPT: Pre-trained. It has “learned” from the dataset it was trained on, but prompting it doesn’t teach it anything new. Your past prompts are kept as “memory” and fed back in to influence output for your future prompts, but the model itself does not actually learn from them.
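
    Roughly what that looks like (just an illustrative sketch, not OpenAI’s actual implementation; the names here are made up): the saved notes get pasted into the context ahead of your message, and the model’s weights never change.

    ```python
    # Illustrative sketch only: "memory" as context injection, not weight updates.
    # saved_memories and call_model are made-up names for this example.

    saved_memories = [
        "User prefers metric units.",
        "User is learning Rust.",
    ]

    def build_prompt(user_message: str) -> str:
        # Past notes are simply prepended to the prompt text.
        memory_block = "\n".join(f"- {note}" for note in saved_memories)
        return (
            "Known facts about the user:\n"
            f"{memory_block}\n\n"
            f"User: {user_message}\nAssistant:"
        )

    def call_model(prompt: str) -> str:
        # Stand-in for an API call; the pre-trained weights stay frozen either way.
        return f"(model output for: {prompt[:40]}...)"

    print(call_model(build_prompt("How far is 10 miles in km?")))
    ```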

    • Knock_Knock_Lemmy_In@lemmy.world · edited · 14 hours ago

      The next generation of GPT will be trained on everyone’s past prompts (ever been A/B tested on OpenAI?). That’s what I mean by generational learning.

      • hark@lemmy.world · 13 hours ago

        Maybe. It’s probably not high quality training data for the most part, though.