• ikt@aussie.zone · 4 days ago

    If you want to run their models without registration or usage limits, look into LM Studio and similar tools.

    You can run the LLM entirely on your local machine.

    • MysteriousSophon21@lemmy.world · 2 days ago

      Ollama is also a great option for running Mistral models locally. It's super lightweight, and I've been running mistral-7b on my MacBook without issues. It even integrates nicely with Audiobookshelf if you're into that kind of self-hosted setup.
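
      If anyone wants to script against a local Ollama install rather than use the CLI, here's a minimal Python sketch using only the standard library. It assumes Ollama is running on its default port (11434) and that you've already pulled the model with `ollama pull mistral` — the function names here are just illustrative, not part of any library.

```python
import json
import urllib.request

# Ollama's default local API endpoint (assumes a stock install on port 11434)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="mistral"):
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False requests one complete JSON reply instead of a token stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_model(prompt, model="mistral"):
    """Send a prompt to a locally running Ollama server and return the reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

      With the server up, `ask_local_model("Summarize this chapter in one sentence.")` returns the model's text; swap the `model` argument (e.g. `"mistral:7b-instruct"`) for any other tag you've pulled.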