• possumparty@lemmy.blahaj.zone · 3 days ago

    Sure, but you can absolutely run c.ai instances locally. 4o and its cross-chat memory were probably what these individuals actually found useful, though.

    • acosmichippo@lemmy.world · 2 days ago (edited)

      I didn’t say you can’t run any LLM on your own, but not just any LLM will do. The point is that they are attached to a specific version of an LLM that is not locally hostable. c.ai wouldn’t interest them any more than ChatGPT 5 does.