Need to Overcome Claude Conversation Length Limits (one chat conversation) -

Client: AI | Published: 23.10.2025
Budget: $30

I have the Pro Plan 20x and still hit this limit quite often. I rely on Claude for code generation, yet every time a project grows past a certain size the model refuses new turns because the total conversation length is maxed out. The result is broken context, lost instructions, and a lot of manual copy-paste that kills productivity.

I'd like a practical workaround that lets me keep coding large projects with Claude without running into the hard conversation cap. I'm open to any combination of prompt-chaining, automatic summarisation, external context storage, or an API-based relay that feeds only the relevant snippets back to the model. What matters is that I can:

• feed Claude large, multi-file codebases (hundreds of kilobytes)
• keep the iterative back-and-forth going without losing context or exceeding the hard token/turn limit
• preserve formatting so generated code can be copied straight into my repo

Please deliver a small proof-of-concept tool (Python or Node preferred), along with clear setup instructions and a written explanation of how the method skirts the total-length restriction while staying within Anthropic's policies. If you can demonstrate the concept on a public Claude sandbox or via screen-share, even better.

I'll test by running a sample project (about 10K lines split over multiple files) through the workflow and confirming that I can still ask follow-up questions and receive usable code long after the normal limit would have blocked me.

My price is max 10 USD.
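The combination requested above, external context storage plus a relay that feeds only the relevant snippets back to the model, could be sketched roughly as below. The file-level chunking, naive keyword scoring, and rolling-summary string are illustrative assumptions on my part, not the posting's design; a real tool would send the assembled prompt to the Anthropic Messages API each turn instead of accumulating conversation history.

```python
from dataclasses import dataclass, field

@dataclass
class ContextStore:
    """External store: the conversation itself stays short because each
    turn is rebuilt from a rolling summary plus only the relevant files."""
    chunks: dict = field(default_factory=dict)  # path -> source text
    summary: str = ""                           # rolling summary of past turns

    def add_file(self, path: str, text: str) -> None:
        self.chunks[path] = text

    def relevant(self, question: str, top_k: int = 2) -> list:
        # Placeholder retrieval: score each file by how many of the
        # question's words appear in it (a real tool might use embeddings).
        words = set(question.lower().split())
        scored = sorted(
            self.chunks.items(),
            key=lambda kv: -sum(w in kv[1].lower() for w in words),
        )
        return [path for path, _ in scored[:top_k]]

    def build_prompt(self, question: str) -> str:
        # Fresh, bounded prompt per turn: summary + relevant files + question.
        parts = [f"Summary so far: {self.summary}"]
        for path in self.relevant(question):
            parts.append(f"--- {path} ---\n{self.chunks[path]}")
        parts.append(f"Question: {question}")
        return "\n\n".join(parts)

# Hypothetical usage: file names and contents are made up for illustration.
store = ContextStore()
store.add_file("auth.py", "def login(user): ...")
store.add_file("db.py", "def connect(): ...")
store.summary = "We refactored login() to use the new session API."
prompt = store.build_prompt("fix the login bug")
```

Because every turn sends a freshly assembled prompt rather than the whole chat history, the total conversation length never accumulates; the trade-off is that the rolling summary must be updated after each exchange so earlier instructions are not lost.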