Stablecoin issuer Tether announced today (17th) a major technological breakthrough for its AI infrastructure QVAC Fabric: what it calls the world’s first cross-platform BitNet LoRA fine-tuning framework, which allows large language models (LLMs) that previously required enterprise-grade GPUs and cloud computing power to be trained and run for inference on consumer-grade hardware, including smartphones.
Smartphones can now train LLMs: 1B-parameter models fine-tuned within 1 hour
According to data released by Tether, the framework has successfully fine-tuned BitNet models on a range of devices, including the Samsung S25 (Adreno GPU) and the iPhone 16 (Apple GPU).
In extreme tests, the framework has fine-tuned models of up to 13 billion parameters.
AI training tasks that previously ran only on high-end NVIDIA GPUs have now been compressed onto edge devices such as smartphones.
Key technology: BitNet + LoRA — slashing AI costs dramatically
The core of this breakthrough lies in the combination of two technologies:
BitNet (1-bit LLM)
Compresses traditional high-precision weights into just three values (-1, 0, 1), significantly reducing memory and computation requirements.
LoRA (Low-Rank Adaptation)
Trains only a small number of additional parameters (cutting the trainable parameter count by up to 99%), greatly lowering fine-tuning costs.
Together, these enable models to operate in extremely low-resource environments.
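The way the two techniques compose can be sketched as follows. This is a minimal NumPy illustration, not Tether's implementation; all dimensions, names, and the small layer sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def ternary_quantize(w):
    """BitNet-style ternary quantization: map each weight to -1, 0, or +1
    with a per-tensor absmean scale (as in BitNet b1.58)."""
    scale = np.abs(w).mean()
    q = np.clip(np.round(w / (scale + 1e-8)), -1, 1)
    return q, scale

# Frozen base weight (stands in for a pretrained LLM layer; sizes illustrative)
d_in, d_out, rank = 64, 64, 4
W = rng.normal(size=(d_out, d_in)).astype(np.float32)
Wq, s = ternary_quantize(W)

# LoRA: the base weight stays frozen; only two small matrices A and B train.
# B starts at zero so the adapter initially leaves the model unchanged.
A = rng.normal(scale=0.01, size=(rank, d_in)).astype(np.float32)
B = np.zeros((d_out, rank), dtype=np.float32)

x = rng.normal(size=(d_in,)).astype(np.float32)
y = (Wq * s) @ x + B @ (A @ x)  # forward pass: quantized base + adapter

full = d_out * d_in           # params a full fine-tune would update
lora = rank * (d_in + d_out)  # params LoRA actually updates
print(f"trainable: {lora} of {full} ({100 * (1 - lora / full):.1f}% fewer)")
```

Even at this toy scale the trainable-parameter reduction is 87.5%; at realistic LLM dimensions, where the rank is a tiny fraction of the hidden size, the reduction approaches the "up to 99%" cited above.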
Empirical tests show BitNet-1B uses 77.8% less VRAM than Gemma-3-1B and 65.6% less than Qwen3-0.6B. On the same hardware, a model roughly twice the size can be run.
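As a quick arithmetic check on those figures (a sketch: an "X% less VRAM" claim translates to occupying (100 − X)% of the baseline footprint):

```python
def remaining_fraction(pct_less: float) -> float:
    """Footprint relative to baseline implied by an 'X% less VRAM' figure."""
    return 1 - pct_less / 100

# Reported savings imply BitNet-1B occupies these fractions of each baseline:
print(f"vs Gemma-3-1B : {remaining_fraction(77.8):.3f} of baseline VRAM")
print(f"vs Qwen3-0.6B: {remaining_fraction(65.6):.3f} of baseline VRAM")
```

The "roughly twice the model size" claim presumably refers to overall runnable model size, which also depends on activations and runtime overheads rather than weight memory alone.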
GPU acceleration unlocks mobile AI: performance boosted up to 11x
Another key breakthrough of QVAC is enabling BitNet to run on truly “non-NVIDIA” ecosystems: it supports GPUs from AMD, Intel, and Apple Silicon, and even mobile GPUs such as Qualcomm Adreno, ARM Mali, and Apple Bionic.
Large language models are no longer the exclusive domain of tech giants; AI can now be decentralized
Tether CEO Paolo Ardoino stated: “Intelligence will be a key factor in future societal development. It has the potential to enhance social stability, serve as a bridge connecting society, or further empower a select few elites. The future of AI should be accessible, usable, and available to everyone, not monopolized by a few cloud service providers with enormous resources.”
Traditional AI development heavily relies on cloud and large GPU clusters, which are costly and concentrated among a few tech giants. Tether’s QVAC platform supports meaningful large-scale model training on consumer hardware, including smartphones, demonstrating that advanced AI can be decentralized and inclusive. In the coming months, Tether will continue investing significant resources and funds to ensure AI can be used anytime, anywhere on local devices.
This article: AI is no longer a patent of tech giants! Tether launches QVAC — the era of everyone having an LLM? Originally published on Chain News ABMedia.