Gate News, March 16 — According to official sources, the Bittensor subnet Templar (SN3) completed Covenant-72B, the largest decentralized LLM pretraining run to date, on March 10. Covenant-72B is a 72-billion-parameter language model pretrained by the Templar team on Bittensor Subnet 3, trained entirely over the public internet without centralized data centers. The model scored 67.1 on MMLU (zero-shot), surpassing centralized baselines such as LLaMA-2-70B and LLM360 K2 under the same evaluation conditions. It is the largest fully permissionless collaborative language model to date, with more than 70 distinct nodes contributing compute over the course of the run. The team has released all weights and checkpoints under the Apache license.

Following the news, Bittensor (TAO) and its subnet tokens surged: TAO is up 54.8% over the past two weeks, while the Templar subnet token has risen 194% in the past 7 days and currently trades at $19.3.