• moonlight@fedia.io · 21 hours ago

    Yes, you can run ollama via termux.

    Gemma 3 4B is probably a good model to use; fall back to the 1B variant if the 4B won’t run or is too slow.
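    A minimal sketch of the setup, assuming ollama is available as a termux package (on some devices it may need to be built from source instead) and that `gemma3:4b` / `gemma3:1b` are the correct tags in the ollama model library:

```shell
# Install ollama inside termux (assumes it is in the termux package repo)
pkg update && pkg install ollama

# Start the ollama server in the background
ollama serve &

# Pull and chat with Gemma 3 4B; swap in gemma3:1b if this is too slow
ollama run gemma3:4b
```

    The 1B model needs far less RAM, so on older phones it is worth trying first.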

    I wouldn’t rely on it for therapy, though. Maybe it could be useful as a tool, but LLMs are not people, and they aren’t really intelligent either, which I think is necessary for therapy.