Mistral AI, the French company behind the AI assistant Le Chat and several foundational models, is widely regarded as one of France’s most promising tech startups.
Ollama is also a great option for running Mistral models locally. It’s super lightweight, and I’ve been running Mistral 7B on my MacBook without issues. It even integrates nicely with Audiobookshelf if you’re into that kind of self-hosted setup.
If you want to run their models without registration or usage limits, look into LM Studio and similar tools.
You can run the LLM on your local machine.
Yeah, just run it with Ollama
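For anyone who hasn’t tried Ollama yet, a minimal session looks roughly like this. This is a sketch, not a full setup guide: it assumes Ollama is already installed and its background service is running, and that `mistral` (Ollama’s tag for the 7B model) is the model you want.

```shell
# Download the Mistral 7B weights (the "mistral" tag on Ollama's registry)
ollama pull mistral

# Start an interactive chat session in the terminal
ollama run mistral

# Or send a one-off prompt instead of an interactive session
ollama run mistral "Summarize what Mistral AI is in one sentence."

# Ollama also exposes a local HTTP API on port 11434,
# which is what tools like Audiobookshelf hook into
curl http://localhost:11434/api/generate \
  -d '{"model": "mistral", "prompt": "Hello", "stream": false}'
```

The HTTP API is what makes the self-hosted integrations work: any app that can POST JSON to localhost can use the model.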