
For the past few years, the “AI revolution” has lived almost entirely in the cloud. If you wanted to do anything serious with a Large Language Model (LLM), you needed a massive data center, a subscription to a tech giant, or a high-end PC with an expensive Nvidia graphics card. However, a new announcement could radically change that reality in one key respect. Tether recently unveiled QVAC Fabric, an open-source infrastructure project designed to run and—more importantly—train AI models on consumer hardware.
QVAC Fabric: Tether defies big tech with on-device AI training for smartphones
Google's Gemini Nano models let you run local AI on your phone, but you can't modify or "fine-tune" them with your own data. QVAC Fabric breaks this barrier: it is the first framework that lets a regular smartphone learn from your own data. Note that this is different from sending your data to a company's cloud servers for processing (as the Gemini chatbot does). We are talking about training, a key stage in the development of AI platforms.
Training a massive AI model usually requires an incredible amount of memory and processing power. To bypass these hardware limitations, Tether’s engineering team combined two clever technologies: BitNet and LoRA.
What makes the magic possible
BitNet is a specialized architecture from Microsoft that compresses model weights down to extremely low precision, while LoRA (Low-Rank Adaptation) cuts the number of parameters that need to be trained by up to 99%. Together, this translates into a memory-usage reduction of up to 90% for QVAC Fabric compared with full-precision models.
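To see where those percentages come from, here is a back-of-the-envelope sketch. The layer size and LoRA rank below are hypothetical examples chosen for illustration, not Tether's published configuration: LoRA freezes a weight matrix W and trains only two small low-rank factors, while BitNet-style ternary weights need roughly log2(3) ≈ 1.58 bits each instead of 16.

```python
import math

def full_finetune_params(d_out: int, d_in: int) -> int:
    """Trainable entries when the whole weight matrix W is updated."""
    return d_out * d_in

def lora_params(d_out: int, d_in: int, r: int) -> int:
    """Trainable entries for LoRA: W stays frozen; only the low-rank
    factors B (d_out x r) and A (r x d_in) are learned."""
    return d_out * r + r * d_in

def ternary_memory_reduction(baseline_bits: float = 16.0) -> float:
    """BitNet-style ternary weights {-1, 0, +1} take log2(3) ~ 1.58
    bits each, versus 16-bit floats in a full-precision model."""
    return 1.0 - math.log2(3) / baseline_bits

if __name__ == "__main__":
    d = 4096  # an illustrative hidden size for a ~1B-parameter model
    r = 8     # a commonly used LoRA rank
    full = full_finetune_params(d, d)
    lora = lora_params(d, d, r)
    print(f"LoRA trains {lora:,} of {full:,} entries "
          f"({100 * lora / full:.2f}% of the layer)")   # ~0.39%
    print(f"Ternary weights cut weight memory by about "
          f"{100 * ternary_memory_reduction():.0f}%")   # ~90%
```

With these example numbers, LoRA trains well under 1% of the layer's entries and ternary storage removes roughly 90% of the weight memory, which is in the same ballpark as the figures Tether cites.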
By merging both techs, Tether has created a system that can fine-tune a billion-parameter model on a Samsung Galaxy S25 in about 80 minutes. For those using an iPhone 16, the team even successfully pushed the limits to fine-tune models with up to 13 billion parameters.
Perhaps the most significant part of this announcement is its "cross-platform" nature. Until now, this kind of advanced training was mostly locked to Nvidia's specialized systems. QVAC Fabric opens the door for AMD, Intel, and mobile GPUs (such as Qualcomm's Adreno and the GPUs in Apple's chips) to join the party for the first time.
A privacy win
The shift toward “on-device” training is also a big win for privacy. Currently, if you want an AI to help you summarize your medical records or draft emails based on your writing style, you often have to upload that sensitive data to a corporate server.
With Tether’s new framework, that data never has to leave your device. You can train the AI on your personal documents and private files locally. This “anti-cloud” approach ensures that your digital world remains yours alone, removing the need for API keys or monthly subscriptions to big-tech providers. The goal is to ensure that advanced AI doesn’t become a tool reserved only for those with “absurd amounts of resources,” as Tether CEO Paolo Ardoino notes.
The post Your Phone Can Now Train AI: Tether’s QVAC Fabric Changes Everything appeared first on Android Headlines.