Request to Increase Gemini CLI Context Limit Beyond 1M Tokens #16067
Replies: 2 comments
@open54261 in a transformer architecture the cost of doing computations is quadratic in the context length, so every doubling of the window roughly quadruples the attention work. The limit is an engineering trade-off, not an arbitrary cap.
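For concreteness, the standard back-of-the-envelope argument (generic transformer math, nothing specific to Gemini's internals; here n is the sequence length and d the attention dimension):

```latex
\mathrm{Attn}(Q,K,V)=\mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d}}\right)V,
\qquad Q,K,V \in \mathbb{R}^{n \times d}
% The QK^T score matrix is n x n, costing O(n^2 d) multiply-adds,
% so doubling the context length:
\frac{\text{cost}(2n)}{\text{cost}(n)} = \frac{(2n)^{2}\,d}{n^{2}\,d} = 4
```

At n = 1M tokens the score matrix is already 10^12 entries per head per layer, which is why dense attention does not scale just by raising a config value.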
I recommend you try out the Antigravity IDE; its agent seems to be able to dive into large codebases quite easily while still using models with relatively small context windows like 1M tokens ;)
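Roughly, the pattern such agents use is "search, then read only what matters" instead of loading the whole repo. A minimal, model-agnostic sketch of that idea (`ask_model()` is a hypothetical stand-in for whatever model call the agent makes, and 4 chars/token is only a rough heuristic):

```python
import pathlib

TOKEN_BUDGET = 100_000   # stay far below the model's context limit
CHARS_PER_TOKEN = 4      # rough heuristic for source code

def find_snippets(repo: pathlib.Path, keyword: str, window: int = 20):
    """Yield small snippets around lines that mention `keyword`."""
    for path in repo.rglob("*.py"):
        lines = path.read_text(errors="ignore").splitlines()
        for i, line in enumerate(lines):
            if keyword in line:
                start = max(0, i - window)
                end = min(len(lines), i + window)
                yield f"# {path}:{start + 1}-{end}\n" + "\n".join(lines[start:end])

def build_context(repo: pathlib.Path, keyword: str) -> str:
    """Pack only the relevant snippets into the prompt, up to the budget."""
    budget_chars = TOKEN_BUDGET * CHARS_PER_TOKEN
    parts, used = [], 0
    for snippet in find_snippets(repo, keyword):
        if used + len(snippet) > budget_chars:
            break
        parts.append(snippet)
        used += len(snippet)
    return "\n\n".join(parts)

# context = build_context(pathlib.Path("."), "parse_config")
# answer = ask_model("Explain the config parsing:\n" + context)  # hypothetical
```

The point is that the agent never needs the whole repo in context at once; it keeps re-querying with small, relevant slices.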
Hello Gemini Team,
I hope you’re doing well.
I'm writing to raise a concern regarding the current context length limitation in Gemini, specifically the 1M-token context window. The same limit already existed in Gemini 2.5, and it has become a significant bottleneck, especially when using the Gemini CLI.
In practical development workflows, program files and repositories can become fairly large, and once their combined contents approach the context limit, the Gemini CLI struggles or becomes unreliable. This severely limits its usefulness for real-world projects where full context across multiple files is essential.
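For a sense of scale, here is a rough way to check whether a repository even fits in the 1M-token window (assuming the common ~4 characters per token heuristic for code; real tokenizer counts will differ):

```python
import pathlib

CHARS_PER_TOKEN = 4  # rough heuristic for code; real tokenizers vary

def estimate_repo_tokens(root: str, exts=(".py", ".ts", ".go", ".md")) -> int:
    """Very rough token estimate for the text files under `root`."""
    total_chars = 0
    for path in pathlib.Path(root).rglob("*"):
        if path.is_file() and path.suffix in exts:
            total_chars += len(path.read_text(errors="ignore"))
    return total_chars // CHARS_PER_TOKEN

tokens = estimate_repo_tokens(".")
print(f"~{tokens:,} tokens; fits in a 1M window: {tokens < 1_000_000}")
```

Counting characters rather than calling a tokenizer keeps the check dependency-free; it only needs to be accurate to within a factor of two or so to show whether a project is anywhere near the limit.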
Given the progress in model capabilities and the increasing complexity of modern codebases, I strongly believe that increasing the context window beyond 1M tokens would dramatically improve the developer experience and make Gemini far more competitive and practical for large-scale work.
Could someone from the Gemini support or engineering team please advise:
- whether there are plans to increase the context length beyond 1M tokens, and
- whether there are recommended workarounds or upcoming improvements specifically for handling large projects in the Gemini CLI.
I would also like to add that the problem still occurs with Gemini 3.
Thank you for your time and for the continued work on Gemini. I’d really appreciate any guidance or updates on this matter.