Alisa Davidson
Posted: December 3, 2025 3:30 AM Updated: December 3, 2025 3:30 AM
In Brief
Tether Data has launched QVAC Fabric LLM, a framework that enables LLM inference and fine-tuning across consumer devices and multi-vendor hardware, supporting decentralized, privacy-centric, and scalable AI development.
Tether, a financial services company focused on driving freedom, transparency, and innovation through technology, announced the launch of QVAC Fabric LLM, a comprehensive large language model (LLM) inference runtime and fine-tuning framework. The new system allows users to run, train, and customize large language models directly on standard hardware, including consumer GPUs, laptops, and even smartphones, eliminating previous dependencies on high-end cloud servers or specialized NVIDIA setups.
QVAC Fabric LLM brings high-performance LLM inference and fine-tuning, traditionally accessible only to organizations with expensive infrastructure, to everyday hardware. It is presented as the first integrated, portable, and highly scalable system capable of full LLM inference execution, LoRA adaptation, and instruction orchestration across all common laptop, desktop, and server environments (Windows, macOS, Linux), as well as mobile operating systems (iOS and Android). This allows developers and organizations to build, deploy, run, and personalize AI independently, without relying on the cloud, accepting vendor lock-in, or letting sensitive data leave the device.
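The LoRA adaptation mentioned above keeps the pretrained weights frozen and trains only small low-rank matrices, which is what makes fine-tuning feasible on consumer and mobile GPUs. The PyTorch snippet below is a minimal conceptual sketch of that idea, not QVAC Fabric LLM's actual API; layer sizes and the rank are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wrap a frozen linear layer with a trainable low-rank update (W + scale * B @ A)."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():           # freeze the pretrained weights
            p.requires_grad_(False)
        self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Base projection plus the scaled low-rank correction learned during fine-tuning.
        return self.base(x) + self.scale * (x @ self.lora_a.T @ self.lora_b.T)

# Only the small A/B matrices are trained, which is why adapters fit on modest GPUs.
layer = LoRALinear(nn.Linear(4096, 4096), rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable}")  # ~65k vs ~16.8M in the full layer
```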
A notable innovation in this release is the ability to fine-tune models on mobile GPUs such as Qualcomm Adreno and ARM Mali, making it the first production-ready framework to enable modern LLM training on smartphone-grade hardware. These advances enable personalized AI that learns directly on users’ devices, supporting the next generation of on-device AI applications that are privacy-preserving, offline-capable, and resilient.
QVAC Fabric LLM also expands the llama.cpp ecosystem by adding fine-tuning support for newer models such as Llama 3, Qwen3, and Gemma 3 that were not previously supported. These models can now be fine-tuned through a consistent and simple workflow across all hardware platforms.
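Because the framework builds on the llama.cpp ecosystem, a fine-tuned adapter can in principle be applied at inference time through existing llama.cpp bindings. The sketch below uses the llama-cpp-python bindings; the model and adapter file names are placeholders, and QVAC Fabric LLM's own tooling may expose a different interface.

```python
from llama_cpp import Llama  # llama.cpp Python bindings

# Paths are placeholders: a quantized GGUF base model plus a LoRA adapter
# produced by a fine-tuning run (file names are assumptions, not QVAC outputs).
llm = Llama(
    model_path="qwen3-4b-q4_k_m.gguf",
    lora_path="my-finetuned-adapter.gguf",
    n_ctx=4096,
)

out = llm(
    "Summarize the on-device fine-tuning workflow in one sentence.",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```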
By supporting training on a wide range of GPUs, including AMD, Intel, NVIDIA, Apple Silicon, and mobile chips, QVAC Fabric LLM challenges the long-held belief that advanced AI development requires specialized, single-vendor hardware. Consumer GPUs are now viable for critical AI tasks, and mobile devices are becoming legitimate training platforms, broadening the AI development landscape.
For businesses, the framework provides strategic advantages. Organizations can fine-tune AI models internally on secure hardware, eliminating the need to expose sensitive data to external cloud providers. This approach enables organizations to deploy AI models tailored to their internal needs while supporting privacy, compliance, and cost efficiency. QVAC Fabric LLM makes advanced AI more accessible and secure by shifting fine-tuning from centralized GPU clusters to a broader ecosystem of devices already managed by enterprises.
Tether Data Open Sources QVAC Fabric LLM to Enable Decentralized AI Customization
Tether Data has made QVAC Fabric LLM available as open source software under the Apache 2.0 License, along with multiplatform binaries and ready-to-use adapters published on Hugging Face. The framework allows developers to start fine-tuning models with just a few commands, lowering barriers to AI customization that were previously hard to clear.
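Since the binaries and adapters are distributed via Hugging Face, fetching a ready-to-use adapter can be done with the standard huggingface_hub client. The repository and file names below are illustrative placeholders, not confirmed artifact names; consult Tether Data's actual Hugging Face organization for the published releases.

```python
from huggingface_hub import hf_hub_download

# Repo id and filename are hypothetical examples, not confirmed QVAC artifacts.
adapter_path = hf_hub_download(
    repo_id="tether/qvac-fabric-llm-adapters",   # hypothetical repository id
    filename="gemma3-lora-adapter.gguf",         # hypothetical adapter file
)
print(f"Adapter downloaded to: {adapter_path}")
```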
QVAC Fabric LLM represents a concrete move toward decentralized, user-managed AI. While much of the industry continues to prioritize cloud-based solutions, Tether Data is focused on enabling advanced personalization directly on local edge hardware. This approach provides a privacy-first, resilient, and scalable AI platform that can operate independently of centralized infrastructure while supporting operational continuity in regions with high-latency networks, such as emerging markets.
About the author
As MPost’s resident journalist, Alisa specializes in the broad areas of cryptocurrencies, zero-knowledge proofs, investing, and Web3. With a keen eye for new trends and technologies, she provides comprehensive coverage to inform and engage readers about the ever-evolving digital financial landscape.