Hello Gemini Team,
I hope you’re doing well.
I’m writing to raise a concern regarding the current context length limitation in Gemini, specifically the 1M token context. This same limit already existed in Gemini 2.5, and it has become a significant bottleneck—especially when using the Gemini CLI.
In practical development workflows, program files and repositories can become fairly large, and once they exceed a certain size, the Gemini CLI struggles or becomes unreliable. This severely limits its usefulness for real-world projects where full context across multiple files is essential.
Given the progress in model capabilities and the increasing complexity of modern codebases, I strongly believe that increasing the context window beyond 1M tokens would dramatically improve the developer experience and make Gemini far more competitive and practical for large-scale work.
Could someone from the Gemini support or engineering team please advise:
- Whether there are plans to increase the context length beyond 1M tokens
- Whether there are recommended workarounds or upcoming improvements specifically for handling large projects in the Gemini CLI
I would also like to add that the problem still occurs in Gemini 3.
Thank you for your time and for the continued work on Gemini. I’d really appreciate any guidance or updates on this matter.