Local AI
In this series, I explore ways to run different AI models, from LLMs to Stable Diffusion, locally on all sorts of consumer hardware: Android phones, iPads, and consumer laptops or desktops.
3 posts
Top 4 ways to Run LLM locally on Android and iOS In my previous blog, I explored the technical rabbit hole of running Llama.cpp in Termux on an old A... Read post

Running 24/7 Local AI on an Old Android without Overheating Last week I wrote about repurposing an old Android phone to run local AI models. In this follow-up I... Read post

How to Run LLM Models on Old Android Devices Locally This post covers a much more technical and involved way to run local LLMs on Android via a terminal... Read post
