• coldsideofyourpillow@lemmy.cafe

    You don’t need a background in coding at all. In fact, the spaces of machine learning and programming are almost completely separate.

    1. Download Ollama from ollama.com (on Linux it’s a one-line install; see the snippet after this list).

    2. Depending on how much VRAM your GPU has, run one of the following commands (an example session follows below):

      • DeepSeek-R1-Distill-Qwen-1.5B: ollama run deepseek-r1:1.5b

      • DeepSeek-R1-Distill-Qwen-7B: ollama run deepseek-r1:7b

      • DeepSeek-R1-Distill-Llama-8B: ollama run deepseek-r1:8b

      • DeepSeek-R1-Distill-Qwen-14B: ollama run deepseek-r1:14b

      • DeepSeek-R1-Distill-Qwen-32B: ollama run deepseek-r1:32b

      • DeepSeek-R1-Distill-Llama-70B: ollama run deepseek-r1:70b

    Bigger models mean better output, but also longer generation times and higher VRAM requirements.
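
    For step 1 on Linux, the install is the official one-line script from ollama.com (macOS and Windows have regular installers on the same site; check it for current instructions):

      # Official Linux install script from ollama.com
      curl -fsSL https://ollama.com/install.sh | sh

      # Confirm it installed correctly
      ollama --version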
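
    To give a rough idea of what step 2 looks like, here’s a hypothetical session with the 7B distill (exact output differs by version), plus a few handy follow-up commands:

      # The first run downloads the weights (several GB); later runs start immediately
      ollama run deepseek-r1:7b
      >>> Explain what a hash table is in one paragraph.
      ...the model streams its answer here...
      >>> /bye

      ollama list                # which models you have downloaded
      ollama ps                  # which models are currently loaded in memory
      ollama rm deepseek-r1:7b   # delete a model to free disk space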