• MysteriousSophon21@lemmy.world
    2 days ago

    Ollama is also a great option for running Mistral models locally. It's super lightweight, and I've been running mistral-7b on my MacBook without issues. It even integrates nicely with Audiobookshelf if you're into that kind of self-hosted setup.
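
    For anyone who wants to try it, this is roughly the whole setup (assuming Ollama is already installed; the prompt is just a placeholder):

    ```shell
    # Download the 7B Mistral model (a few GB, quantized)
    ollama pull mistral:7b

    # Run a one-off prompt against it
    ollama run mistral:7b "Explain what a self-hosted setup is in one sentence."
    ```

    Ollama also exposes a local HTTP API on port 11434, which is how apps like Audiobookshelf can talk to it.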