
Conversation

@kogre (Contributor) commented Jan 28, 2026

Changelog

kogre and others added 2 commits January 27, 2026 22:42
Add a new method to classify multiple ideas in parallel, with LLM
requests parallelized but database operations kept sequential. This
avoids tenant/DB issues by pre-computing all prompts before spawning
threads.

Also fix Idea#input_topics association to use lft instead of ordering
(since InputTopic now uses acts_as_nested_set).
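
For reference, the association fix might look like the fragment below. This is a hedged sketch only: the scope shown assumes awesome_nested_set's `lft` column, and the real association options in the codebase may differ.

```ruby
class Idea < ApplicationRecord
  # Hypothetical: sort topics by the nested-set left boundary (lft)
  # now that InputTopic uses acts_as_nested_set.
  has_many :input_topics, -> { order(:lft) }
end
```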

Co-Authored-By: Claude Opus 4.5 <[email protected]>
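
The flow described in this commit (sequential prompt building, parallel LLM calls, sequential DB writes) can be sketched roughly as follows. All names here (`ParallelClassifier`, `build_prompt`, `persist_topics!`) are illustrative rather than the actual codebase API, and a plain lambda stands in for the real LLM client.

```ruby
class ParallelClassifier
  def initialize(llm:)
    @llm = llm # any object responding to #call(prompt)
  end

  # Classify many ideas: prompts are pre-computed sequentially (so all
  # tenant-scoped DB reads happen on the main thread), only the LLM
  # round-trips run in parallel threads, and results are written back
  # sequentially afterwards.
  def classify_all(ideas)
    # 1. Sequential: build every prompt before spawning any thread.
    prompts = ideas.map { |idea| [idea, build_prompt(idea)] }

    # 2. Parallel: each thread does one LLM call and nothing else.
    threads = prompts.map do |idea, prompt|
      Thread.new { [idea, @llm.call(prompt)] }
    end
    results = threads.map(&:value)

    # 3. Sequential: database writes stay on the main thread.
    results.each { |idea, topics| persist_topics!(idea, topics) }
    results
  end

  private

  def build_prompt(idea)
    "Classify: #{idea[:title]}" # stand-in for real prompt assembly
  end

  def persist_topics!(idea, topics)
    idea[:topics] = topics # stand-in for a DB write
  end
end
```

Keeping steps 1 and 3 on the calling thread is what sidesteps the tenant/DB issues the commit mentions: no ActiveRecord connection is ever touched from a spawned thread.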
Update the job to process batches of ideas using the parallel
classification method. The classify_all_inputs_in_background! method
now batches ideas into groups of 24, and each job processes its batch
in parallel.

For single-idea classification (e.g., on idea create/update), the job
is called with an array of one idea ID.

Co-Authored-By: Claude Opus 4.5 <[email protected]>
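
The batching behaviour described above can be sketched as below. The helper name and the way batches are fanned out to jobs are assumptions; only the slice size of 24 comes from the commit message.

```ruby
BATCH_SIZE = 24

# Hypothetical helper: split idea IDs into groups of BATCH_SIZE; in the
# real job each group would be enqueued as one background job. A single
# idea simply yields a one-element batch.
def ideas_in_batches(idea_ids, batch_size = BATCH_SIZE)
  idea_ids.each_slice(batch_size).to_a
end
```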
@cl-dev-bot (Collaborator) commented

Messages

📖 Changelog provided 🎉
📖 Notion issue: TAN-6354
📖 Run the e2e tests
📖 Check translation progress

Generated by 🚫 dangerJS against 724e498

