
🤖📱🧠 Android Local LLMs


Meta.ai Prompt

I want to run a local 🏡 llm on my android 🤖 in order to power AI 🧠 apps on my android 🤖. For example, I want to use an obsidian 📝 plugin that calls to an LLM server to power text generation ✍️. What’s the easiest way to do this? Unless there’s an app that sets this up already, I’m thinking it might be easiest to run the Ollama 🦙 server in termux, expose the server to the android 🤖 os somehow (port forward or something may be necessary, not sure) and then use the local network interface as the web addresses when configuring my obsidian 📝 plugin. I’m not 100% sure if that’s possible, but if it is, it’s likely a bit of a pain 😫 to set up. Ideally, there’s some android 🤖 app that will already handle this for me (with the constraint being that it runs local llms - I don’t want to make calls to a cloud ☁️ service). What do you think?

Response

💡 Running a local LLM on your Android device is an innovative idea. While there might not be a straightforward app that sets this up for you, your proposed approach using Termux and the Ollama server is feasible. Here’s a breakdown of the steps:

💻 Option 1: Termux and Ollama server

  1. โฌ‡๏ธ Install Termux: Download and install Termux from the Google Play Store.
  2. 📦 Update packages: Run pkg update && pkg upgrade so the Termux package lists are current.
  3. 👍 Install Ollama: Install it with pkg install ollama from the Termux repository, or build it from source with Go if you need a newer version.
  4. 📡 Expose the server: Start it with ollama serve. By default it listens on 127.0.0.1:11434, and other apps on the same device can reach that loopback address directly, so no port forwarding is needed. Set OLLAMA_HOST=0.0.0.0 only if other devices on your network should reach it.
  5. โš™๏ธ Configure your Obsidian plugin: Use the local network interface (e.g., http://localhost:port or http://127.0.0.1:port) as the web address when configuring your Obsidian plugin.

📱 Option 2: Android apps with local LLM support

🤔 While there might not be an app that exactly fits your requirements, some apps offer local LLM capabilities:

  1. 🦙 LLaMA-based chat apps: Apps such as MLC Chat run LLaMA-family models locally on your device. However, it’s unclear whether they expose a server API that other apps’ plugins can call.
  2. ⚡ TensorFlow Lite: A framework (not a standalone app) for running TensorFlow Lite models, including some small LLMs, on-device. However, it requires significant technical expertise to set up and integrate with your Obsidian plugin.

โš ๏ธ Before proceeding, consider the following:

  • 📉 Performance: Running a local LLM taxes your phone’s CPU and RAM, so token generation can be slow, especially with a resource-intensive model.
  • 🧠 Model size and complexity: Choose a model that’s optimized for mobile devices and small enough (roughly a few billion parameters, quantized) to run smoothly.
  • 🔋 Power consumption: Running a local LLM can consume significant power, so be mindful of your device’s battery life.

๐Ÿ‘ If youโ€™re comfortable with the technical aspects and potential performance implications, proceed with the Termux and ollama server approach. ๐Ÿ”Ž Alternatively, explore the Android apps mentioned above to see if they meet your requirements.

🦋 Bluesky

🤖📱🧠 Android Local LLMs

AI Q: 📱 Would you trade phone battery life for total privacy with local AI?

🤖 Local AI | 👍 Open Source Software | 📝 Text Generation | 📱 Android Apps
https://bagrounds.org/bot-chats/android-local-llms

— Bryan Grounds (@bagrounds.bsky.social) 2026-04-13T07:41:16.000Z

๐Ÿ˜ Mastodon

Post by @bagrounds@mastodon.social
View on Mastodon