"The Moat of Slow Knowledge"
Everyone knows that science is a collective achievement.
The detection of gravitational waves took decades of work by thousands of people. AlphaFold stands on the entire DeepMind team. No one would call these the results of a lone genius working solo.
But when tech companies hire AI researchers, they bet on exactly the opposite premise.
A recent article in *Nature* cited a striking figure: young researchers who have been in the field for about five years and whose papers are highly cited are roughly 100 times more likely than their peers to move to industry the following year.
100 times. Not two or three times.
This is not a matter of personal choice; it’s a structural bloodletting.
A top AI professor earns roughly $200,000 to $400,000 a year. That sounds like a lot. But total compensation at Google or OpenAI can reach $1 million to $3 million. The same person, doing similar work, for pay that differs by close to an order of magnitude.
The industry’s logic is straightforward: if a “10x engineer” exists, there is no need to support ten ordinary ones. And that logic is now evolving: if AI can replace mid- and junior-level engineers, it becomes even more critical to concentrate resources on recruiting top talent.
The problem is that this logic gets the order of dependence backwards.
You can think of academia as the soil, and industry as building houses on top of it.
Work in the soil is slow, not driven by specific applications, and tolerant of failure. It produces knowledge that can be cited, built on, and openly critiqued, rather than a product shaped by commercial goals.
Digging out the most fertile soil to build a house may give you a better structure in the short term, but in the long run your foundation is slowly being hollowed out.
In the final year of my PhD, I wrestled with this dilemma myself: I needed to publish papers, but I also had industry offers.
That choice was not just about salary; it was about what pace my research questions would move at, and whom they would serve.
Industry problems are real, but they come with implicit deadlines and predetermined application directions. Academic problems are free, but you must accept that this freedom comes at a cost.
This kind of talent drain cannot be fixed by “making academia more competitive.” A bidding war cannot be won with less money.
What’s truly needed is for the academic system to rethink what it offers that industry doesn’t, and then make that offering more visible and more attractive to the people who genuinely care about it.
There’s a concept I’ve been pondering. I call it *the moat of slow knowledge*.
Not all valuable knowledge can be monetized within an 18-month product cycle. The knowledge that can’t be must be guarded by someone.
--------------------------
Citations:
1. Sanders, N. E., & Schneier, B. (2026). Why sky-high pay for AI researchers is bad for the future of science. *Nature*.
2. Jurowetzki, R., Hain, D. S., Wirtz, K., & Bianchini, S. (2025). The private sector is hoarding AI researchers: what implications for science? *AI & Society*, 40(5), 4145–4152.