Most Coding Will Be Done by AI Systems in a Few Years, Google DeepMind Gives Context
2025/05/05 11:08
Several tech leaders have predicted that AI will take over many jobs in the coming years, and coding is expected to be among the most affected. We've already seen promising demos of AI systems like AlphaCode and GPT-4 handling coding tasks, and the capability is advancing rapidly.
Now, Google DeepMind research scientist Nikolay Savinov has given some context on how that’ll actually happen. In a discussion on the DeepMind blog, Savinov predicts that we’ll soon see “superhuman coding systems” that will become indispensable tools for every coder, thanks to the advent of massive, readily available context windows.
A model's context window is the amount of input it can ingest and process in one go. Currently, frontier models such as Google's Gemini offer context windows of about 1 million tokens, enough for roughly 700,000 words of text. This already enables strong performance on many tasks, and as context windows grow larger still, we'll see a further leap in quality and retrieval capabilities.
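As a rough illustration of what a token budget means in practice, here is a minimal sketch. The four-characters-per-token figure is a common rule of thumb, not any provider's actual tokenizer, and the page size is an assumption:

```python
# Sketch: estimate whether a piece of text fits in a context window.
# Assumes ~4 characters per token -- real tokenizers vary by model
# and by language.

CHARS_PER_TOKEN = 4

def estimate_tokens(text: str) -> int:
    """Crude token estimate from character count."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_context(text: str, window_tokens: int = 1_000_000) -> bool:
    """Check the estimate against a context-window budget."""
    return estimate_tokens(text) <= window_tokens

# A 500-page document at an assumed ~2,000 characters per page
# comes to roughly 250,000 tokens -- comfortably inside a
# 1-million-token window, far outside a 100,000-token one.
document = "x" * (500 * 2000)
print(estimate_tokens(document))           # 250000
print(fits_in_context(document))           # True
print(fits_in_context(document, 100_000))  # False
```

The same arithmetic explains Savinov's point below: a few million tokens covers a book or a small codebase, while tens of millions are needed before large projects fit whole.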
“I think it will take maybe a little bit more time, but it’s going to happen. As the cost decreases, the longer context also gets unlocked. So, I think reasonably soon we will see the 10 million context window. And this will be a standard offering from some AI provider. When this happens, that’s going to be a deal-breaker for some applications like coding,” said Savinov.
“Because I think for one or two million, you can only fit small and medium-sized codebases in the context. But 10 million actually unlocks large coding projects to be included in the context completely. And I think this is going to be crucial for achieving superhuman coding systems which will be totally unrivaled and become the new standard tool for every coder.”
This will also enable applications that no one is thinking about yet. As Savinov puts it, near-perfect million-token context windows will "unlock totally incredible applications" we can't currently imagine.
“But I think when we get to ten or hundred million tokens, we will be able to create truly general-purpose AI systems which will be able to perform any task that a human can do and even better,” added Savinov. “And I think this is really going to revolutionize our society and create new possibilities for humanity.”
Of course, this is still dependent on the researchers at DeepMind and other labs making progress on context windows, which Savinov says will require “more deep learning innovations.” We’re already seeing context windows increase rapidly, and as the cost of processing such large inputs comes down, we can expect to see even larger context windows in the coming years.
This is especially useful for use cases like coding. If an application's entire codebase fits within the context window, the AI system can see all of the code at once, including the dependencies between its different parts, and can make changes while keeping in mind how those parts work together.
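The idea above can be sketched in a few lines: walk a project tree, and only send the whole codebase as a single prompt if it fits the window; otherwise signal that fallback chunking is needed. The token heuristic and the prompt layout here are illustrative assumptions, not any provider's actual API:

```python
import os

CHARS_PER_TOKEN = 4  # rough heuristic; real tokenizers differ


def collect_sources(root: str, exts=(".py",)) -> dict[str, str]:
    """Read every matching source file under root into {path: contents}."""
    sources = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(exts):
                path = os.path.join(dirpath, name)
                with open(path, encoding="utf-8", errors="ignore") as f:
                    sources[path] = f.read()
    return sources


def build_prompt(sources: dict[str, str], window_tokens: int):
    """Concatenate all files into one prompt if the token estimate fits
    the window; return None to signal that per-file chunking is needed."""
    blob = "\n\n".join(f"# file: {path}\n{text}"
                       for path, text in sources.items())
    if len(blob) // CHARS_PER_TOKEN <= window_tokens:
        return blob
    return None
```

When `build_prompt` returns `None`, the caller is back in the smaller-window regime the next paragraph describes: the model only ever sees fragments, and cross-file dependencies fall outside its view.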
A smaller context window, in contrast, means the LLM can see only part of the code at a time. A change it suggests in one part can break other parts that depend on it, and manually debugging such breakage can chip away at the advantages of using an AI system in the first place.
But if we get to 10-million-token context windows—the top AI systems today have context lengths of around 1 million—the entire codebases of complex software projects will fit. This could revolutionize software development, enabling AI to generate code, debug, refactor, and even design entire systems with minimal human intervention. And it's perhaps this upcoming increase in context window lengths that companies like Anthropic and Meta have in mind when they say that most coding will be done by AI systems in the not-too-distant future.
Disclaimer: info@kdj.com
The information provided is not trading advice. kDJ.com assumes no responsibility for any investments made based on the information provided in this article. Cryptocurrencies are highly volatile; please research thoroughly and invest with caution!
If you believe content used on this website infringes your copyright, please contact us immediately (info@kdj.com) and we will remove it promptly.