2026 is consolidating into 3 really big bets, and if any one of them fails, the result would be a short-term collapse of the bubble:
1. data centers
Anthropic and xAI are scaling aggressively toward 1M GPUs, and both plan to hit that target by Q2. The resulting models trained on this infrastructure need to confirm that the compute scaling law is still effective; otherwise it will indicate we need to explore a new model architecture (a rough sketch of what that check looks like is below). This also depends on energy infrastructure (power supply) being scaled, though that's going to be a problem for the next decade, let alone this year.
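To make "the scaling law still holds" concrete, here's a minimal sketch of the check: fit a power law of eval loss against training compute to past runs, then see whether the next, much larger run lands near the extrapolation. Every number below is made up for illustration; only the power-law form is the standard assumption.

```python
# Rough sketch of what "confirming the compute scaling law" means in practice:
# fit a power law  loss ~ a * C^(-b)  to past (training compute, eval loss) points,
# then check whether the next much-larger run lands near the extrapolation.
# All numbers below are made up for illustration.
import numpy as np

# hypothetical past runs: training compute in FLOPs, final eval loss
compute = np.array([1e22, 1e23, 1e24, 1e25])
loss    = np.array([2.60, 2.35, 2.13, 1.93])

# a power law is linear in log-log space: log(loss) = log(a) - b * log(C)
slope, intercept = np.polyfit(np.log(compute), np.log(loss), 1)
a, b = np.exp(intercept), -slope

def predicted_loss(c_flops: float) -> float:
    return a * c_flops ** (-b)

# extrapolate to a ~1M-GPU-scale training run (again, an assumed figure)
big_run_compute = 1e27
print(f"fitted exponent b = {b:.3f}")
print(f"extrapolated loss at {big_run_compute:.0e} FLOPs = {predicted_loss(big_run_compute):.2f}")
# if the actual run comes in well above this curve, the "scaling still works" bet is in trouble
```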
2. memory
We only have enough memory (HBM) this year to support ~15 GW worth of GPUs, while the hyperscalers are targeting 30-40 GW over the next 2 years (rough back-of-envelope below). We're cutting it kinda close here: if Samsung, $MU, or SK Hynix screw up their production, this will stall bet #1. Also, whoever secures the majority of this memory capacity to build GPUs wins the most within the constrained production capacity, and right now that's $NVDA.
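Here's a back-of-envelope on why memory is the gate, converting GW of GPU capacity into HBM demand. The per-GPU power and HBM figures are assumptions for illustration, not supply-chain data; the point is the ratio between what 15 GW and 30-40 GW would require.

```python
# Back-of-envelope for why HBM supply gates the GPU buildout.
# Every number here is an assumption for illustration, not a supply-chain figure.

KW_PER_GPU_ALL_IN = 1.8      # assumed: accelerator + cooling/networking overhead, kW
HBM_GB_PER_GPU    = 288      # assumed: HBM capacity per top-end GPU, GB

def hbm_needed_for_gw(gw: float) -> float:
    """Return exabytes of HBM needed to populate `gw` gigawatts of GPU capacity."""
    gpus = gw * 1e6 / KW_PER_GPU_ALL_IN          # 1 GW = 1e6 kW
    return gpus * HBM_GB_PER_GPU / 1e9           # GB -> exabytes

supported_now = hbm_needed_for_gw(15)            # the ~15 GW the post says HBM can cover
targeted      = hbm_needed_for_gw(35)            # midpoint of the 30-40 GW target

print(f"HBM for 15 GW of GPUs: ~{supported_now:.2f} EB")
print(f"HBM for 35 GW of GPUs: ~{targeted:.2f} EB  ({targeted / supported_now:.1f}x more)")
# the ratio is what matters: Samsung / SK Hynix / $MU output has to roughly double or more
# over two years, or bet #1 stalls regardless of how many GPU dies get fabbed
```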
3. cheaper, more effective infra
Nvidia's latest GPU, Vera Rubin, is 5x more performant and, more importantly, 4x more compute efficient, meaning inference costs have tanked: frontier intelligence for pennies! There will be an abundance of compute available to train, but more importantly to use, AI this year, so Jevons paradox needs to prove itself through rising demand (a toy model of that condition is below). If that fails, the above two bets collapse. IMO I'm least concerned about this one: I think agent harnesses and post-training RL will continue to strengthen models.
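A toy model of the Jevons-paradox condition bet #3 rests on: per-token cost falls ~4x, but total inference spend only grows if demand is price-elastic enough. The elasticity value is an assumed knob, not a measurement.

```python
# Toy model of the Jevons-paradox condition the post is betting on:
# per-token inference cost falls ~4x (the assumed Vera Rubin efficiency gain),
# and total spend on inference only grows if demand is elastic enough.
# The elasticity value is a made-up knob, not a measured figure.

COST_DROP        = 4.0    # assumed: ~4x cheaper per token, generation over generation
PRICE_ELASTICITY = 1.6    # assumed: % demand increase per % price decrease (>1 = elastic)

old_cost_per_mtok = 10.0                          # $ per million tokens, illustrative
new_cost_per_mtok = old_cost_per_mtok / COST_DROP

# constant-elasticity demand: tokens consumed scale as price^(-elasticity)
demand_multiplier = COST_DROP ** PRICE_ELASTICITY
spend_multiplier  = demand_multiplier / COST_DROP

print(f"cost per Mtok: ${old_cost_per_mtok:.2f} -> ${new_cost_per_mtok:.2f}")
print(f"token demand grows ~{demand_multiplier:.1f}x")
print(f"total inference spend changes ~{spend_multiplier:.1f}x")
# Jevons paradox holds (spend rises despite cheaper compute) only when
# PRICE_ELASTICITY > 1; below that, cheaper inference shrinks the revenue pool
# and the first two bets lose their demand backstop.
```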
very bullish