Bittensor Training Milestone Draws Spotlight From Chamath Palihapitiya and Nvidia CEO Jensen Huang

Coinpedia

A decentralized AI experiment once confined to crypto circles just earned a public nod from Nvidia CEO Jensen Huang, signaling that distributed model training may be inching closer to the mainstream.

Open Source AI Momentum Builds With Nvidia CEO Endorsement

Chamath Palihapitiya spotlighted Bittensor’s Covenant-72B during an episode of the All-In Podcast, framing it as a tangible example of decentralized artificial intelligence (AI) moving beyond theory. Bittensor operates as a decentralized, blockchain-driven network that establishes a peer-to-peer marketplace in which machine learning models and AI compute are exchanged and incentivized.

Palihapitiya described the effort in plain terms: a large language model (LLM) trained without centralized infrastructure, powered instead by a network of independent contributors. “They managed to train a 4 billion parameter LLaMA model, totally distributed, with a bunch of people contributing excess compute,” he said, calling it “a pretty crazy technical accomplishment.”

The comparison landed with a familiar analogy. “There are random people, and each person gets a little share,” Palihapitiya added, likening the effort to early distributed computing projects that harnessed idle hardware worldwide.

Huang did not dismiss the idea. Instead, he leaned into a broader framing of the AI market, suggesting that decentralized and proprietary approaches are not mutually exclusive. “These two things are not A or B; it’s A and B,” Huang said. “There is no question about it.”

That dual-track vision reflects a growing divide—and overlap—within AI. On one side are closed, highly polished systems like ChatGPT, Claude, and Gemini. On the other are open-weight and decentralized models that allow developers and organizations to customize systems for specific needs.

Huang made clear he sees both tracks as essential. “Models are a technology, not a product,” he said, noting that most users will continue relying on polished, general-purpose systems rather than building their own from scratch.

At the same time, he pointed to industries where customization is not optional. “There are all these industries where their domain expertise… has to be captured in a way that they can control,” Huang explained, adding that “that can only come from open models.”

That statement lands squarely in Bittensor’s wheelhouse. Covenant-72B, developed through its Subnet 3 (Templar), represents one of the largest decentralized training runs to date, coordinating more than 70 contributors across standard internet connections without a central authority.

Technically, the model pushes boundaries. Built with 72 billion parameters and trained on roughly 1.1 trillion tokens, it leverages innovations such as compressed communication protocols and distributed data parallelism to make training viable outside traditional data centers.
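
The article does not spell out what those “compressed communication protocols” look like, but top-k gradient sparsification is one standard way to cut the bandwidth of distributed data parallelism enough to run over ordinary internet links. The Python sketch below is a minimal illustration under that assumption, not Templar’s actual protocol; `compress_topk`, `decompress_topk`, and the 1% ratio are hypothetical.

```python
# Illustrative sketch only: top-k gradient compression for data-parallel
# training over slow links. Not Bittensor/Templar's actual wire protocol.
import math
import torch

def compress_topk(grad: torch.Tensor, ratio: float = 0.01):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries."""
    flat = grad.flatten()
    k = max(1, int(flat.numel() * ratio))
    _, indices = torch.topk(flat.abs(), k)  # select the biggest entries
    return indices, flat[indices]           # ship ~1% of the values

def decompress_topk(indices: torch.Tensor, values: torch.Tensor, shape):
    """Rebuild a mostly-zero gradient tensor on the receiving peer."""
    flat = torch.zeros(math.prod(shape))
    flat[indices] = values
    return flat.reshape(shape)

# Each peer would compress its local gradient before exchanging it,
# then average the decompressed gradients it receives from others.
grad = torch.randn(1024, 1024)
idx, vals = compress_topk(grad, ratio=0.01)
restored = decompress_topk(idx, vals, grad.shape)
print(f"sent {idx.numel()} of {grad.numel():,} gradient values")
```

For a sense of scale, the common 6·N·D rule of thumb puts a 72-billion-parameter, 1.1-trillion-token run at roughly 6 × 72×10⁹ × 1.1×10¹² ≈ 4.8 × 10²³ training FLOPs, which is why shrinking per-step communication matters so much outside a traditional data center.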

Performance metrics suggest it is not merely experimental. Benchmark results place it in competition with established centralized models, a detail that helps explain why the project has drawn attention beyond crypto-native audiences.

The market noticed as well. The project’s token, TAO, has risen 24% since the clip of Palihapitiya and Huang began making the rounds on social media.

Still, Huang’s comments suggest the real story is not disruption but coexistence between the two approaches. Proprietary AI systems will likely remain dominant for general users, while open and decentralized models carve out roles in specialized, cost-sensitive, or sovereignty-driven applications.

For startups, the Nvidia CEO outlined a pragmatic playbook: start open, then layer in proprietary advantages. “Every startup we’re investing in now is open source first, and then going to the proprietary model,” he said.

In other words, the future of AI may not belong to a single architecture or philosophy. It may belong to those who can navigate both—and know when to use each.

FAQ 🔎

  • What is Bittensor’s Covenant-72B?

A 72-billion-parameter language model trained through a decentralized network of contributors without centralized infrastructure.

  • What did Jensen Huang say about decentralized AI?

He said open and proprietary AI models will coexist, describing the relationship as “A and B,” not a choice between them.

  • Why is this development important?

It shows large-scale AI models can be trained outside traditional data centers, challenging assumptions about infrastructure needs.

  • How does this affect the AI industry?

It supports a hybrid future where centralized platforms and decentralized models serve different roles across industries.
