So Jensen Huang talked about decentralized AI training on the All-In podcast and TAO pumped A LOT


Guess what, @0g_labs already proved it in June 2025 (almost a year ago) by releasing the DiLoCoX paper on arXiv for training AI models
They trained a 107-billion-parameter model across 20 machines over a 1 Gbps network
Compared to Bittensor's Covenant 72B (a single trained model), DiLoCoX is a general framework that can train ANY model while cutting communication overhead by 357x
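For context, here is a toy sketch of the DiLoCo-style idea that DiLoCoX builds on: each worker runs many local SGD steps between global syncs, so communication rounds drop by roughly the local-step count H. This is an illustrative simplification, not the paper's actual implementation; all names and numbers below are made up for the example.

```python
# Hedged toy sketch of DiLoCo-style low-communication training.
# Workers train locally for H steps, then sync once (outer step),
# instead of syncing gradients on every step like naive data parallelism.

def local_steps(w, lr, H):
    """Run H local SGD steps on a toy quadratic loss L(w) = w**2 / 2 (grad = w)."""
    for _ in range(H):
        w = w - lr * w
    return w

def outer_sync(global_w, worker_ws):
    """Outer step: average the workers' weight deltas into the global model."""
    avg_delta = sum(w - global_w for w in worker_ws) / len(worker_ws)
    return global_w + avg_delta

H, lr, global_w = 100, 0.01, 10.0   # illustrative hyperparameters
workers, rounds, sync_rounds = 4, 5, 0

for _ in range(rounds):
    # In this toy, all workers start each round from the same global weights.
    worker_ws = [local_steps(global_w, lr, H) for _ in range(workers)]
    global_w = outer_sync(global_w, worker_ws)
    sync_rounds += 1                 # one communication round per outer step

# Naive data-parallel SGD would sync every step: rounds * H = 500 syncs.
# Here we synced only 5 times: a 100x cut in communication rounds.
print(sync_rounds, global_w)
```

DiLoCoX layers further tricks on top of this (the paper describes pipeline parallelism, delayed one-step overlap, and adaptive gradient compression) to reach its reported 357x figure; the sketch only shows the local-steps-between-syncs core.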
It has a full stack behind it: compute, storage, data availability (DA), and chain. Not a single other project has all four layers
The team will speak about decentralized AI at EthCC Cannes on 1 April, looking forward 🙌