Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the settings.
📝 Walkthrough
Extends the Mongo AST with many aggregation pipeline stages, renames MongoReadStage to MongoPipelineStage, eliminates AggregatePipelineEntry, introduces pipeline-style update support for write commands, and adds builder methods for the new stages.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Dev as Developer (PipelineBuilder)
    participant DSL as mongoPipeline DSL
    participant AST as Query AST (MongoPipelineStage / MongoUpdateSpec)
    participant Lower as Lowering (lowerStage / lowerPipeline)
    participant Adapter as MongoAdapter
    participant Mongo as MongoDB
    Dev->>DSL: build pipeline or update spec
    DSL->>AST: construct AggregateCommand / Update command with MongoPipelineStage/MongoUpdateSpec
    AST->>Lower: provide pipeline / update spec
    Lower->>Adapter: emit lowered command objects (documents / pipelines)
    Adapter->>Mongo: execute command (aggregate / update with pipeline)
    Mongo-->>Adapter: result
    Adapter-->>Dev: runtime.execute result
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks | ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
@prisma-next/mongo-runtime
@prisma-next/family-mongo
@prisma-next/sql-runtime
@prisma-next/family-sql
@prisma-next/extension-paradedb
@prisma-next/extension-pgvector
@prisma-next/postgres
@prisma-next/sql-orm-client
@prisma-next/target-mongo
@prisma-next/adapter-mongo
@prisma-next/driver-mongo
@prisma-next/contract
@prisma-next/utils
@prisma-next/config
@prisma-next/errors
@prisma-next/framework-components
@prisma-next/operations
@prisma-next/contract-authoring
@prisma-next/ids
@prisma-next/psl-parser
@prisma-next/psl-printer
@prisma-next/cli
@prisma-next/emitter
@prisma-next/migration-tools
@prisma-next/vite-plugin-contract-emit
@prisma-next/runtime-executor
@prisma-next/mongo-codec
@prisma-next/mongo-contract
@prisma-next/mongo-value
@prisma-next/mongo-contract-psl
@prisma-next/mongo-emitter
@prisma-next/mongo-query-ast
@prisma-next/mongo-orm
@prisma-next/mongo-pipeline-builder
@prisma-next/mongo-lowering
@prisma-next/mongo-wire
@prisma-next/sql-contract
@prisma-next/sql-errors
@prisma-next/sql-operations
@prisma-next/sql-schema-ir
@prisma-next/sql-contract-psl
@prisma-next/sql-contract-ts
@prisma-next/sql-contract-emitter
@prisma-next/sql-lane-query-builder
@prisma-next/sql-relational-core
@prisma-next/sql-builder
@prisma-next/target-postgres
@prisma-next/adapter-postgres
@prisma-next/driver-postgres
Actionable comments posted: 1
🧹 Nitpick comments (2)
packages/2-mongo-family/4-query/query-ast/src/stages.ts (1)
614-639: `MongoFacetStage.rewrite()` always creates a new instance. Unlike other stages that return `this` when no changes occur, `rewrite()` always creates a new `MongoFacetStage`. Consider adding an early return when context has no applicable rewriters.
♻️ Optional optimization to avoid unnecessary allocations:
```diff
 rewrite(context: MongoStageRewriterContext): MongoPipelineStage {
+  if (!context.aggExpr && !context.filter) return this;
   const newFacets: Record<string, ReadonlyArray<MongoPipelineStage>> = {};
   for (const [key, pipeline] of Object.entries(this.facets)) {
     newFacets[key] = pipeline.map((stage) => stage.rewrite(context));
   }
   return new MongoFacetStage(newFacets);
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/4-query/query-ast/src/stages.ts` around lines 614 - 639, The rewrite method on MongoFacetStage currently always allocates a new MongoFacetStage; change MongoFacetStage.rewrite to detect whether any child stage rewrites actually changed and return this when nothing changed: for each facet in this.facets, map stage -> rewritten = stage.rewrite(context) but track a boolean changed if rewritten !== stage (or pipeline length/order differs); only build newFacets when changed is true for any facet, and if no changes across all facets simply return this; otherwise construct and return new MongoFacetStage(newFacets).
packages/2-mongo-family/4-query/query-ast/test/stages-extended.test.ts (1)
1-928: Test file exceeds 500-line guideline. At 928 lines, this file exceeds the 500-line limit. Consider splitting it into logical groups:
- stages-extended.output.test.ts (MongoOutStage, MongoMergeStage)
- stages-extended.bucketing.test.ts (MongoBucketStage, MongoBucketAutoStage)
- stages-extended.geo.test.ts (MongoGeoNearStage, MongoGraphLookupStage)
- stages-extended.window.test.ts (MongoSetWindowFieldsStage, MongoDensifyStage, MongoFillStage)
- stages-extended.search.test.ts (MongoSearchStage, MongoSearchMetaStage, MongoVectorSearchStage)
- stages-extended.misc.test.ts (MongoUnionWithStage, MongoFacetStage, visitor tests)

However, the current structure is cohesive and well-organized. This is a recommended refactor for maintainability.
As per coding guidelines: "Keep test files under 500 lines to maintain readability and navigability."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/4-query/query-ast/test/stages-extended.test.ts` around lines 1 - 928, The test file exceeds the 500-line guideline; split the large stages-extended.test.ts into multiple smaller test files by grouping related describe blocks — create e.g. stages-extended.output.test.ts (move describe('MongoOutStage') and describe('MongoMergeStage')), stages-extended.bucketing.test.ts (move describe('MongoBucketStage') and describe('MongoBucketAutoStage')), stages-extended.geo.test.ts (move describe('MongoGeoNearStage') and describe('MongoGraphLookupStage')), stages-extended.window.test.ts (move describe('MongoSetWindowFieldsStage'), describe('MongoDensifyStage'), describe('MongoFillStage')), stages-extended.search.test.ts (move describe('MongoSearchStage'), describe('MongoSearchMetaStage'), describe('MongoVectorSearchStage')), and stages-extended.misc.test.ts (move describe('MongoUnionWithStage'), describe('MongoFacetStage'), and describe('MongoStageVisitor')); ensure each new file imports the same helpers/types (e.g., MongoAggFieldRef, MongoAddFieldsStage, MongoMatchStage, MongoFieldFilter, prefixFieldRefRewriter) and update exports/tsconfig/test globs if needed so the test runner picks up the new files, then remove the moved describe blocks from the original file to bring it below 500 lines.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/2-mongo-family/5-query-builders/pipeline-builder/test/builder.test.ts`:
- Around line 308-518: The file exceeds 500 lines because the entire "new stage
builder methods" suite was appended to builder.test.ts; split that describe
block into a new test file (e.g., builder.new-stages.test.ts) and keep
core/baseline builder tests in the original builder.test.ts. Move the whole
describe('new stage builder methods', ...) block (which references
createOrdersBuilder(), and the Mongo* stage classes like MongoRedactStage,
MongoOutStage, MongoMergeStage, MongoUnionWithStage, MongoBucketStage,
MongoBucketAutoStage, MongoGeoNearStage, MongoFacetStage, MongoGraphLookupStage,
MongoSetWindowFieldsStage, MongoDensifyStage, MongoFillStage, MongoSearchStage,
MongoSearchMetaStage, MongoVectorSearchStage) into the new file, update
imports/exports so createOrdersBuilder and those stage constructors are
available, and run tests to ensure names and setup (e.g., any shared fixtures)
are preserved.
---
Nitpick comments:
In `@packages/2-mongo-family/4-query/query-ast/src/stages.ts`:
- Around line 614-639: The rewrite method on MongoFacetStage currently always
allocates a new MongoFacetStage; change MongoFacetStage.rewrite to detect
whether any child stage rewrites actually changed and return this when nothing
changed: for each facet in this.facets, map stage -> rewritten =
stage.rewrite(context) but track a boolean changed if rewritten !== stage (or
pipeline length/order differs); only build newFacets when changed is true for
any facet, and if no changes across all facets simply return this; otherwise
construct and return new MongoFacetStage(newFacets).
In `@packages/2-mongo-family/4-query/query-ast/test/stages-extended.test.ts`:
- Around line 1-928: The test file exceeds the 500-line guideline; split the
large stages-extended.test.ts into multiple smaller test files by grouping
related describe blocks — create e.g. stages-extended.output.test.ts (move
describe('MongoOutStage') and describe('MongoMergeStage')),
stages-extended.bucketing.test.ts (move describe('MongoBucketStage') and
describe('MongoBucketAutoStage')), stages-extended.geo.test.ts (move
describe('MongoGeoNearStage') and describe('MongoGraphLookupStage')),
stages-extended.window.test.ts (move describe('MongoSetWindowFieldsStage'),
describe('MongoDensifyStage'), describe('MongoFillStage')),
stages-extended.search.test.ts (move describe('MongoSearchStage'),
describe('MongoSearchMetaStage'), describe('MongoVectorSearchStage')), and
stages-extended.misc.test.ts (move describe('MongoUnionWithStage'),
describe('MongoFacetStage'), and describe('MongoStageVisitor')); ensure each new
file imports the same helpers/types (e.g., MongoAggFieldRef,
MongoAddFieldsStage, MongoMatchStage, MongoFieldFilter, prefixFieldRefRewriter)
and update exports/tsconfig/test globs if needed so the test runner picks up the
new files, then remove the moved describe blocks from the original file to bring
it below 500 lines.
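The change-tracking rewrite suggested for MongoFacetStage above can be sketched as standalone TypeScript. The Stage and FacetStage types here are simplified stand-ins for the PR's actual AST classes, not the real source:

```typescript
// Simplified stand-ins for MongoPipelineStage and its rewriter context.
interface Stage {
  rewrite(ctx: RewriteCtx): Stage;
}
type RewriteCtx = Record<string, unknown>;

class FacetStage implements Stage {
  constructor(readonly facets: Record<string, ReadonlyArray<Stage>>) {}

  rewrite(ctx: RewriteCtx): Stage {
    let changed = false;
    const newFacets: Record<string, ReadonlyArray<Stage>> = {};
    for (const [key, pipeline] of Object.entries(this.facets)) {
      newFacets[key] = pipeline.map((stage) => {
        const rewritten = stage.rewrite(ctx);
        if (rewritten !== stage) changed = true;
        return rewritten;
      });
    }
    // Reuse `this` when no child stage changed, avoiding a fresh allocation.
    return changed ? new FacetStage(newFacets) : this;
  }
}

// An identity stage: rewrite returns the same instance.
const identity: Stage = {
  rewrite() {
    return identity;
  },
};
const facet = new FacetStage({ counts: [identity] });
console.log(facet.rewrite({}) === facet); // true: no allocation when unchanged
```

The identity check on each child is what makes the early return sound even when some facets change and others do not.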
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yml
Review profile: CHILL
Plan: Pro
Run ID: fbdccd59-0c70-494c-8a66-d556dd729475
⛔ Files ignored due to path filters (2)
- projects/orm-consolidation/plans/tml-2212-remaining-stages-plan.md is excluded by !projects/**
- projects/orm-consolidation/specs/remaining-stages-pipeline-style-updates.spec.md is excluded by !projects/**
📒 Files selected for processing (15)
- packages/2-mongo-family/4-query/query-ast/src/commands.ts
- packages/2-mongo-family/4-query/query-ast/src/exports/index.ts
- packages/2-mongo-family/4-query/query-ast/src/stages.ts
- packages/2-mongo-family/4-query/query-ast/src/visitors.ts
- packages/2-mongo-family/4-query/query-ast/test/stages-extended.test.ts
- packages/2-mongo-family/4-query/query-ast/test/stages.test-d.ts
- packages/2-mongo-family/5-query-builders/orm/src/compile.ts
- packages/2-mongo-family/5-query-builders/orm/test/collection.test.ts
- packages/2-mongo-family/5-query-builders/orm/test/compile.test.ts
- packages/2-mongo-family/5-query-builders/pipeline-builder/src/builder.ts
- packages/2-mongo-family/5-query-builders/pipeline-builder/test/builder.test.ts
- packages/3-mongo-target/2-mongo-adapter/src/lowering.ts
- packages/3-mongo-target/2-mongo-adapter/src/mongo-adapter.ts
- packages/3-mongo-target/2-mongo-adapter/test/lowering.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-adapter.test.ts
Actionable comments posted: 1
🧹 Nitpick comments (2)
test/integration/test/mongo/pipeline-builder.test.ts (2)
453-472: Assert the $merge output, not just a minimum count. toBeGreaterThanOrEqual(4) still passes if $merge duplicates rows, misses a category, or overwrites the placeholder incorrectly. Please check the expected summary documents directly.

Stronger assertion sketch:

```diff
 await exec(plan);
 const docs = await db.collection('summary').find().toArray();
-expect(docs.length).toBeGreaterThanOrEqual(4);
+expect(docs).toHaveLength(4);
+expect(docs).toEqual(
+  expect.arrayContaining([
+    expect.objectContaining({ _id: 'placeholder', source: 'old' }),
+    expect.objectContaining({ _id: 'electronics', total: 1698 }),
+    expect.objectContaining({ _id: 'furniture', total: 400 }),
+    expect.objectContaining({ _id: 'stationery', total: 5 }),
+  ]),
+);
```

As per coding guidelines, "Avoid tautological tests that only restate fixture input - tests must verify behavior."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/integration/test/mongo/pipeline-builder.test.ts` around lines 453 - 472, The test currently only checks docs.length after running the pipeline built by products().group(...).merge(...).build() which is weak; update the test to assert the actual merged documents in the 'summary' collection match the expected summary results. After seed() and exec(plan), query db.collection('summary') and either (a) compute the expected summary (category totals) from the same seed fixture and assert the set of {_id: category, total: sum} equals the returned docs, or (b) assert specific expected documents exist and that the placeholder {_id: 'placeholder'} was removed/updated as intended; use the pipeline builder symbols (products(), group, merge, exec) and the 'summary' collection to locate and replace the toBeGreaterThanOrEqual(4) assertion with these stronger content checks.
1-519: Split this suite before it turns into a catch-all. This file is already over the 500-line limit and covers several distinct concerns (basic, aggregation, write stages, complex pipelines). A small split now will keep failures easier to triage and make the next stage additions cheaper to review. As per coding guidelines, "Keep test files under 500 lines to maintain readability and navigability. If a test file exceeds this limit, it should be split into multiple files."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@test/integration/test/mongo/pipeline-builder.test.ts` around lines 1 - 519, The test file exceeds the 500-line guideline because multiple distinct concerns are combined; split it into smaller suites (e.g., basic-pipeline, aggregation, write-stages, complex-pipelines) by moving the relevant describe blocks (those titled "match + sort + limit + skip", "group with accumulators", "addFields", "project", "count", "unwind", "lookup", "facet", "bucket", "unionWith", "sortByCount", "sample", "out", "merge", "complex pipeline") into new test files; extract shared helpers/constants (pipeline factory, products(), orders(), PRODUCTS, ORDERS, seed(), exec(), describeWithMongoDB usage) into a test helper module or keep minimal per-file copies, update imports (mongoPipeline, acc, fn, Mongo*Stage classes) in each new file, and ensure the test bootstrap still registers all new test files with the runner.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@test/integration/test/mongo/pipeline-builder.test.ts`:
- Around line 497-508: The test case title says "lookup → unwind → group" but
the pipeline only calls orders().lookup(...) and pipes a MongoMatchStage, so
either rename the it(...) description to reflect the actual pipeline (e.g.,
"lookup with match: orders enriched with product category") or add the missing
stages to the pipeline by calling the unwind() and group() methods after
lookup() (and before build()) so the pipeline matches the title; locate the test
using the orders() builder and the MongoMatchStage / MongoFieldFilter.eq usage
to update the title or append calls to unwind() and group() to produce the
intended aggregation behavior.
---
Nitpick comments:
In `@test/integration/test/mongo/pipeline-builder.test.ts`:
- Around line 453-472: The test currently only checks docs.length after running
the pipeline built by products().group(...).merge(...).build() which is weak;
update the test to assert the actual merged documents in the 'summary'
collection match the expected summary results. After seed() and exec(plan),
query db.collection('summary') and either (a) compute the expected summary
(category totals) from the same seed fixture and assert the set of {_id:
category, total: sum} equals the returned docs, or (b) assert specific expected
documents exist and that the placeholder {_id: 'placeholder'} was
removed/updated as intended; use the pipeline builder symbols (products(),
group, merge, exec) and the 'summary' collection to locate and replace the
toBeGreaterThanOrEqual(4) assertion with these stronger content checks.
- Around line 1-519: The test file exceeds the 500-line guideline because
multiple distinct concerns are combined; split it into smaller suites (e.g.,
basic-pipeline, aggregation, write-stages, complex-pipelines) by moving the
relevant describe blocks (those titled "match + sort + limit + skip", "group
with accumulators", "addFields", "project", "count", "unwind", "lookup",
"facet", "bucket", "unionWith", "sortByCount", "sample", "out", "merge",
"complex pipeline") into new test files; extract shared helpers/constants
(pipeline factory, products(), orders(), PRODUCTS, ORDERS, seed(), exec(),
describeWithMongoDB usage) into a test helper module or keep minimal per-file
copies, update imports (mongoPipeline, acc, fn, Mongo*Stage classes) in each new
file, and ensure the test bootstrap still registers all new test files with the
runner.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yml
Review profile: CHILL
Plan: Pro
Run ID: 09575e8e-04ec-42de-bd2c-eba45d5e61e5
📒 Files selected for processing (5)
- examples/mongo-demo/package.json
- examples/mongo-demo/src/db.ts
- examples/mongo-demo/src/server.ts
- test/integration/package.json
- test/integration/test/mongo/pipeline-builder.test.ts
✅ Files skipped from review due to trivial changes (2)
- test/integration/package.json
- examples/mongo-demo/package.json
Actionable comments posted: 4
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/2-mongo-family/5-query-builders/pipeline-builder/src/builder.ts`:
- Around line 378-380: The search() and vectorSearch() methods currently widen
the builder's output type to DocShape, which loses the concrete row typing;
update both methods to preserve the original Shape generic instead of DocShape
by calling this.#withStage<Shape>(...) so the PipelineBuilder<TContract, Shape>
is maintained after adding a MongoSearchStage/MongoVectorSearchStage; leave
searchMeta() as-is (widened) and ensure the created stage types
(MongoSearchStage, MongoVectorSearchStage) are used with the Shape generic so
downstream DSL steps keep field-level typing.
- Around line 283-287: The unionWith method currently claims the output row
shape remains Shape which is unsound; change its signature to widen the
resulting Shape to reflect unknown/unspecified output (e.g. unionWith<NewShape =
unknown>(collection: string, pipeline?: ReadonlyArray<MongoPipelineStage>):
PipelineBuilder<TContract, Shape | NewShape>) or at minimum return
PipelineBuilder<TContract, Shape | Record<string, unknown>> so downstream stages
and build() don't assume the pre-union Shape; keep the same body and still call
this.#withStage(new MongoUnionWithStage(collection, pipeline)) but update the
generic return type to the widened Shape to preserve type-safety.
- Around line 272-276: The merge method's whenMatched parameter currently uses
ReadonlyArray<MongoPipelineStage>, which re-exposes the full 28-stage pipeline;
change its type to the narrower update-pipeline union introduced in the PR (the
same union used elsewhere for update pipelines) so whenMatched accepts
ReadonlyArray of that update-pipeline type (replace MongoPipelineStage reference
with the update-pipeline union type name). Update the signature for merge(...
on?: ..., whenMatched?: ...) to reference the update-pipeline type instead of
MongoPipelineStage to ensure the restricted stage surface is enforced.
- Around line 268-279: The out() and merge() methods currently return
PipelineBuilder and allow further .pipe() chaining, which permits invalid
pipelines; change their return types to a terminal builder type (e.g.,
TerminalPipelineBuilder or PipelineTerminator) instead of PipelineBuilder so
callers cannot append stages after write operations. Update the signatures of
out(collection: string, db?: string) and merge(options: {...}) to return that
terminal type, implement the terminal type as a thin wrapper produced by
`#withStage`<Shape>(...) that exposes only execution/serialization methods (no
pipe/append methods), and ensure any code that constructs MongoOutStage and
MongoMergeStage remains unchanged but now returns the terminal builder to
prevent further chaining.
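The terminal-builder suggestion for out()/merge() above can be illustrated with a minimal sketch. The TerminalPipelineBuilder name comes from the review comment; everything else here is a simplified illustration, not the PR's actual builder:

```typescript
// Minimal sketch: write stages return a terminal type that exposes only
// build(), so no further pipeline stages can be chained after them.
type Stage = { name: string; collection?: string };

interface TerminalPipelineBuilder {
  build(): Stage[];
}

class PipelineBuilder {
  constructor(private readonly stages: ReadonlyArray<Stage> = []) {}

  match(): PipelineBuilder {
    return new PipelineBuilder([...this.stages, { name: '$match' }]);
  }

  // $out must be the final stage, so it returns the terminal interface:
  // callers can build/execute the plan but cannot append more stages.
  out(collection: string): TerminalPipelineBuilder {
    const stages = [...this.stages, { name: '$out', collection }];
    return { build: () => stages };
  }
}

const plan = new PipelineBuilder().match().out('summary');
console.log(plan.build().map((s) => s.name).join(' ')); // $match $out
// plan.match(...) is now a compile-time error; only plan.build() exists.
```

Because the narrowing happens purely in the return type, no runtime checks are needed to reject post-write stages.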
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yml
Review profile: CHILL
Plan: Pro
Run ID: 812b82e4-d2b2-4fc5-8afc-a15bd550efd2
⛔ Files ignored due to path filters (2)
- projects/orm-consolidation/specs/remaining-stages-pipeline-style-updates.spec.md is excluded by !projects/**
- projects/orm-consolidation/specs/reviews/code-review.md is excluded by !projects/**
📒 Files selected for processing (4)
- packages/2-mongo-family/5-query-builders/pipeline-builder/src/builder.ts
- packages/2-mongo-family/5-query-builders/pipeline-builder/test/builder.test-d.ts
- packages/2-mongo-family/7-runtime/test/runtime-types.test-d.ts
- packages/2-mongo-family/7-runtime/vitest.config.ts
028bae7 to f92372a
Actionable comments posted: 4
🧹 Nitpick comments (2)
examples/mongo-demo/src/server.ts (1)
130-141: getDashboard facet pipeline loses type safety; consider documenting this tradeoff. Using raw stage classes (MongoCountStage, MongoSortStage, MongoLimitStage) inside facet() is necessary since the builder doesn't support nested builders, but it loses the type-safe DSL benefits. A brief comment explaining this tradeoff would help readers understand the pattern.

📝 Suggested documentation:

```diff
 export async function getDashboard(pipeline: Db['pipeline'], runtime: Db['runtime']) {
+  // facet() requires raw stage instances since nested builders aren't supported.
+  // Each facet sub-pipeline runs independently on the input documents.
   const plan = pipeline
     .from('posts')
     .facet({
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@examples/mongo-demo/src/server.ts` around lines 130 - 141, Add a short inline comment in the getDashboard function explaining that using raw stage classes (MongoCountStage, MongoSortStage, MongoLimitStage) inside pipeline.facet() is intentional because the builder currently doesn't support nested builders, and that this choice sacrifices some DSL type-safety in favor of expressiveness; reference getDashboard, pipeline.facet, and the specific stage classes so maintainers understand the tradeoff and rationale.
packages/2-mongo-family/5-query-builders/pipeline-builder/src/builder.ts (1)
292-308: bucket() and bucketAuto() widen to DocShape; consider documenting this behavior. These methods correctly widen to DocShape since the output shape depends on the opaque output accumulators. This is appropriate, but a brief JSDoc comment would help users understand why type information is lost.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/2-mongo-family/5-query-builders/pipeline-builder/src/builder.ts` around lines 292 - 308, Add a brief JSDoc above the bucket and bucketAuto methods explaining that they widen the pipeline's output type to DocShape because the output fields are determined by opaque accumulator specifications; reference the methods bucket() and bucketAuto(), the fact they call `#withStage`<DocShape>(new MongoBucketStage(options)) and `#withStage`<DocShape>(new MongoBucketAutoStage(options)), and note that the output?: Record<string, MongoAggAccumulator> makes precise typing impossible so consumers should expect a widened DocShape result.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@packages/2-mongo-family/4-query/query-ast/src/stages.ts`:
- Around line 510-548: The MongoBucketAutoStage constructor must validate that
the provided buckets value is a positive integer; add a check in the
MongoBucketAutoStage constructor (before assigning this.buckets and calling
this.freeze()) that ensures options.buckets is a number,
Number.isInteger(options.buckets) and options.buckets > 0, and throw a
descriptive Error (as MongoLimitStage does) if the check fails so invalid bucket
counts are rejected (note: rewrite creates a new MongoBucketAutoStage(opts) so
the constructor validation covers rewritten instances too).
- Around line 470-508: The MongoBucketStage constructor must validate that the
boundaries array has at least two entries; update the MongoBucketStage
constructor to check options.boundaries.length >= 2 and throw a clear error
(e.g., throw new Error(...)) if it is shorter, keeping the existing freeze of
this.boundaries and other assignments; ensure the same invariant is preserved
when rewrite() constructs a new MongoBucketStage (no extra change needed there
beyond the constructor validation).
- Around line 940-974: Add runtime validation in MongoVectorSearchStage's
constructor to ensure numCandidates >= limit and throw a clear Error when the
constraint is violated; locate the constructor in the MongoVectorSearchStage
class and after assigning this.numCandidates and this.limit, validate that
this.numCandidates >= this.limit (or that options.numCandidates >=
options.limit) and throw an Error describing the requirement (e.g.,
"numCandidates must be >= limit") so invalid stage objects cannot be
constructed.
In `@packages/3-mongo-target/2-mongo-adapter/src/lowering.ts`:
- Around line 175-181: lowerWindowField currently casts the output of
lowerAggExpr(wf.operator) to an object but lowerAggExpr can return primitives;
add a defensive runtime check in lowerWindowField to ensure the returned value
is a plain non-null object (and not an array) before spreading it, and if it is
not, throw a clear error referencing wf.operator (or otherwise fail fast) so we
don't silently produce invalid window field shapes; keep the existing logic that
adds result['window'] = { ...wf.window } when wf.window exists.
---
Nitpick comments:
In `@examples/mongo-demo/src/server.ts`:
- Around line 130-141: Add a short inline comment in the getDashboard function
explaining that using raw stage classes (MongoCountStage, MongoSortStage,
MongoLimitStage) inside pipeline.facet() is intentional because the builder
currently doesn't support nested builders, and that this choice sacrifices some
DSL type-safety in favor of expressiveness; reference getDashboard,
pipeline.facet, and the specific stage classes so maintainers understand the
tradeoff and rationale.
In `@packages/2-mongo-family/5-query-builders/pipeline-builder/src/builder.ts`:
- Around line 292-308: Add a brief JSDoc above the bucket and bucketAuto methods
explaining that they widen the pipeline's output type to DocShape because the
output fields are determined by opaque accumulator specifications; reference the
methods bucket() and bucketAuto(), the fact they call `#withStage`<DocShape>(new
MongoBucketStage(options)) and `#withStage`<DocShape>(new
MongoBucketAutoStage(options)), and note that the output?: Record<string,
MongoAggAccumulator> makes precise typing impossible so consumers should expect
a widened DocShape result.
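The validations requested in the stages.ts and lowering.ts comments above amount to simple constructor and guard checks. A standalone sketch follows; the class names carry a Sketch suffix to stress these are illustrations of the requested invariants, not the PR's source:

```typescript
// $bucket: the boundaries array must have at least two entries.
class MongoBucketStageSketch {
  constructor(readonly boundaries: ReadonlyArray<number>) {
    if (boundaries.length < 2) {
      throw new Error('$bucket requires at least two boundaries');
    }
  }
}

// $bucketAuto: the bucket count must be a positive integer.
class MongoBucketAutoStageSketch {
  constructor(readonly buckets: number) {
    if (!Number.isInteger(buckets) || buckets <= 0) {
      throw new Error(`$bucketAuto requires a positive integer bucket count, got ${buckets}`);
    }
  }
}

// $vectorSearch: MongoDB requires numCandidates >= limit.
class MongoVectorSearchStageSketch {
  constructor(readonly numCandidates: number, readonly limit: number) {
    if (numCandidates < limit) {
      throw new Error(`numCandidates (${numCandidates}) must be >= limit (${limit})`);
    }
  }
}

// Window-field lowering: the lowered operator may be a primitive, so guard
// before spreading it into the wire document.
function lowerWindowFieldSketch(
  loweredOperator: unknown,
  window?: Record<string, unknown>,
): Record<string, unknown> {
  if (typeof loweredOperator !== 'object' || loweredOperator === null || Array.isArray(loweredOperator)) {
    throw new Error('window field operator must lower to a plain object');
  }
  const result: Record<string, unknown> = { ...loweredOperator };
  if (window) result['window'] = { ...window };
  return result;
}
```

Putting the checks in the constructors means rewritten instances are covered too, since rewrite() re-runs the constructor.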
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yml
Review profile: CHILL
Plan: Pro
Run ID: bfb7e636-4207-433c-b526-8c38799781c7
⛔ Files ignored due to path filters (3)
- projects/orm-consolidation/plans/tml-2212-remaining-stages-plan.md is excluded by !projects/**
- projects/orm-consolidation/specs/remaining-stages-pipeline-style-updates.spec.md is excluded by !projects/**
- projects/orm-consolidation/specs/reviews/code-review.md is excluded by !projects/**
📒 Files selected for processing (23)
- examples/mongo-demo/package.json
- examples/mongo-demo/src/db.ts
- examples/mongo-demo/src/server.ts
- packages/2-mongo-family/4-query/query-ast/src/commands.ts
- packages/2-mongo-family/4-query/query-ast/src/exports/index.ts
- packages/2-mongo-family/4-query/query-ast/src/stages.ts
- packages/2-mongo-family/4-query/query-ast/src/visitors.ts
- packages/2-mongo-family/4-query/query-ast/test/stages-extended.test.ts
- packages/2-mongo-family/4-query/query-ast/test/stages.test-d.ts
- packages/2-mongo-family/5-query-builders/orm/src/compile.ts
- packages/2-mongo-family/5-query-builders/orm/test/collection.test.ts
- packages/2-mongo-family/5-query-builders/orm/test/compile.test.ts
- packages/2-mongo-family/5-query-builders/pipeline-builder/src/builder.ts
- packages/2-mongo-family/5-query-builders/pipeline-builder/test/builder.test-d.ts
- packages/2-mongo-family/5-query-builders/pipeline-builder/test/builder.test.ts
- packages/2-mongo-family/7-runtime/test/runtime-types.test-d.ts
- packages/2-mongo-family/7-runtime/vitest.config.ts
- packages/3-mongo-target/2-mongo-adapter/src/lowering.ts
- packages/3-mongo-target/2-mongo-adapter/src/mongo-adapter.ts
- packages/3-mongo-target/2-mongo-adapter/test/lowering.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-adapter.test.ts
- test/integration/package.json
- test/integration/test/mongo/pipeline-builder.test.ts
✅ Files skipped from review due to trivial changes (7)
- test/integration/package.json
- examples/mongo-demo/package.json
- packages/2-mongo-family/7-runtime/vitest.config.ts
- packages/2-mongo-family/5-query-builders/orm/src/compile.ts
- packages/2-mongo-family/5-query-builders/orm/test/compile.test.ts
- packages/2-mongo-family/5-query-builders/pipeline-builder/test/builder.test.ts
- packages/3-mongo-target/2-mongo-adapter/test/lowering.test.ts
🚧 Files skipped from review as they are similar to previous changes (10)
- packages/2-mongo-family/5-query-builders/orm/test/collection.test.ts
- examples/mongo-demo/src/db.ts
- packages/2-mongo-family/7-runtime/test/runtime-types.test-d.ts
- packages/2-mongo-family/4-query/query-ast/test/stages.test-d.ts
- packages/3-mongo-target/2-mongo-adapter/src/mongo-adapter.ts
- packages/3-mongo-target/2-mongo-adapter/test/mongo-adapter.test.ts
- packages/2-mongo-family/4-query/query-ast/src/visitors.ts
- packages/2-mongo-family/4-query/query-ast/src/exports/index.ts
- test/integration/test/mongo/pipeline-builder.test.ts
- packages/2-mongo-family/4-query/query-ast/test/stages-extended.test.ts
…line-style updates, and cleanup
Covers: complete pipeline stage AST (14 new stages), rename MongoReadStage to MongoPipelineStage, eliminate AggregatePipelineEntry, pipeline-style update support for write commands, and builder methods for all new stages.
The union type name now reflects that the pipeline includes non-read stages ($out, $merge, $count). Pure rename with no behavior change.
… and tests
Adds MongoOutStage, MongoUnionWithStage, MongoBucketStage, MongoBucketAutoStage, MongoGeoNearStage, MongoFacetStage, MongoGraphLookupStage, MongoMergeStage, MongoSetWindowFieldsStage, MongoDensifyStage, MongoFillStage, MongoSearchStage, MongoSearchMetaStage, and MongoVectorSearchStage. Each stage includes constructor, visitor dispatch, rewrite support, lowering to wire format, runtime and type-level tests, and exports. MongoPipelineStage union now covers all 28 MongoDB aggregation stages.
AggregateCommand.pipeline now accepts only MongoPipelineStage[], removing the Record<string, unknown> arm. The isTypedStage() guard and raw pass-through in lowerPipeline() are no longer needed. Use RawAggregateCommand for raw/untyped pipelines.
Introduces MongoUpdateSpec (union of Record<string, MongoValue> and MongoUpdatePipelineStage[]) for UpdateOneCommand, UpdateManyCommand, and FindOneAndUpdateCommand. The adapter lowers pipeline-style updates through lowerStage() for each stage in the array.
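The array-vs-object branching can be sketched as follows; the value and stage types here are deliberately simplified, and `lowerUpdate`/`isPipelineUpdate` are illustrative names rather than the adapter's real internals:

```typescript
// Simplified value type for update documents.
type MongoValue = string | number | boolean | null | MongoValue[] | { [k: string]: MongoValue };

// Pipeline-style updates reuse a restricted set of stages (simplified shapes).
type MongoUpdatePipelineStage =
  | { kind: 'addFields'; fields: Record<string, MongoValue> }
  | { kind: 'project'; spec: Record<string, 0 | 1> };

type MongoUpdateSpec = Record<string, MongoValue> | ReadonlyArray<MongoUpdatePipelineStage>;

function isPipelineUpdate(spec: MongoUpdateSpec): spec is ReadonlyArray<MongoUpdatePipelineStage> {
  return Array.isArray(spec);
}

function lowerUpdate(spec: MongoUpdateSpec): unknown {
  if (isPipelineUpdate(spec)) {
    // Pipeline form: lower each stage to its wire document.
    return spec.map((s) =>
      s.kind === 'addFields' ? { $addFields: s.fields } : { $project: s.spec },
    );
  }
  // Traditional operator document passes through unchanged.
  return spec;
}
```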
Adds fluent methods to PipelineBuilder for redact, out, merge, unionWith, bucket, bucketAuto, geoNear, facet, graphLookup, setWindowFields, densify, fill, search, searchMeta, and vectorSearch. Each method constructs the corresponding AST stage and returns a new builder.
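The fluent pattern can be sketched as an immutable builder: each method appends a stage and returns a new builder, leaving the original untouched. Names are suffixed `Sketch` to flag that this is illustrative, not the real `PipelineBuilder`, and the stage shapes are simplified wire documents rather than AST nodes:

```typescript
type Stage = { readonly [op: string]: unknown };

class PipelineBuilderSketch {
  private constructor(private readonly stages: readonly Stage[]) {}

  static empty(): PipelineBuilderSketch {
    return new PipelineBuilderSketch([]);
  }

  // Each fluent method copies the stage list and returns a NEW builder.
  unionWith(collection: string): PipelineBuilderSketch {
    return new PipelineBuilderSketch([...this.stages, { $unionWith: collection }]);
  }

  out(collection: string): PipelineBuilderSketch {
    return new PipelineBuilderSketch([...this.stages, { $out: collection }]);
  }

  build(): readonly Stage[] {
    return this.stages;
  }
}

const base = PipelineBuilderSketch.empty();
const built = base.unionWith('archive').out('reports');
```

Because builders are immutable, a partially built pipeline can be safely shared and branched without stages leaking between variants.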
…stages. Address code review findings F01-F03, F05:
- F01: test output accumulator rewriting for bucket/bucketAuto
- F02: test restrictSearchWithMatch rewriting for graphLookup
- F03: verify rewritten query content for geoNear (not just defined)
- F05: test rewrite with object into for merge stage
19 tests exercising a representative sample of pipeline stages (match, sort, group, addFields, project, count, unwind, lookup, facet, bucket, unionWith, sortByCount, sample, out, merge) plus complex multi-stage pipelines — all via the mongoPipeline DSL executed against MongoMemoryReplSet.
Four new pipeline endpoints demonstrate the type-safe aggregation DSL end-to-end: author leaderboard (group+sort+lookup), recent post summaries (sort+limit+addFields+project), posts with authors (lookup+sort), and a dashboard facet (count+recentPosts+byAuthor).
The type test only checks MongoQueryPlan (default Row=unknown), never asserting that build() carries specific row types or that execute() returns typed results. All row fields resolve to unknown at the consumption site.
Every pipeline transformation/operation/expression must have a type-level test proving resolved row types are correct — not just shape keys. This is non-negotiable: the existing tests only checked intermediate DocShape, not the final Row type users get from execute(), and the entire type flow was silently broken.
Verify that build() produces MongoQueryPlan<Row> with concrete field types for every pipeline transformation: from, match, sort, limit/skip/sample, addFields, project (inclusion + computed), group, unwind, count, sortByCount, lookup, replaceRoot, pipe, and chained pipelines. Also test nullable field resolution and execute() type inference. Fix sortByCount() to capture the field type parameter so _id resolves to the correct type instead of unknown.
…sult. Verify that MongoRuntime.execute(plan) returns AsyncIterableResult<Row> where Row matches the resolved row type from PipelineBuilder.build(). Enable typecheck in the runtime vitest config for .test-d.ts files.
Array.isArray does not narrow ReadonlyArray out of a union in the false branch in TS 5.9. Use an explicit type guard to ensure the Record<string, MongoValue> branch is properly narrowed.
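A minimal reproduction of the narrowing issue and the fix; the types are simplified, and `isStageArray`/`describeSpec` are illustrative names. With a union like `Record<string, V> | ReadonlyArray<S>`, a bare `Array.isArray` check does not remove the `ReadonlyArray` arm in the false branch, so a user-defined type guard is used to narrow both branches cleanly:

```typescript
type MongoValue = string | number;
type Stage = { kind: string };
type UpdateSpec = Record<string, MongoValue> | ReadonlyArray<Stage>;

// Explicit type guard: the predicate return type tells the compiler exactly
// which arm of the union survives in each branch.
function isStageArray(spec: UpdateSpec): spec is ReadonlyArray<Stage> {
  return Array.isArray(spec);
}

function describeSpec(spec: UpdateSpec): string {
  if (isStageArray(spec)) {
    return `pipeline with ${spec.length} stage(s)`;
  }
  // Here spec is narrowed to Record<string, MongoValue>.
  return `operator document with keys: ${Object.keys(spec).join(', ')}`;
}
```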
The value-objects PR changed ContractField to nest codecId under type: { kind, codecId }. Update the test contract to match so ExtractCodecId resolves concrete types instead of falling back to unknown. Also use PlanRow helper for more robust type comparisons through dist declarations.
…rch stages. Validate boundaries length >= 2 for $bucket, buckets as positive integer for $bucketAuto, and limit/numCandidates constraints for $vectorSearch, consistent with existing MongoLimitStage pattern.
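Such constructor-time validation can be sketched as below, assuming the checks run when the stage is built; the helper names and error messages are illustrative, not the package's actual API:

```typescript
// $bucket: at least two boundaries are needed to define one bucket.
function assertBucketBoundaries(boundaries: readonly unknown[]): void {
  if (boundaries.length < 2) {
    throw new RangeError('$bucket requires at least 2 boundaries');
  }
}

// Shared helper for $bucketAuto's `buckets` and $vectorSearch's counts.
function assertPositiveInteger(name: string, value: number): void {
  if (!Number.isInteger(value) || value <= 0) {
    throw new RangeError(`${name} must be a positive integer, got ${value}`);
  }
}

// $vectorSearch: limit must not exceed numCandidates (assumption: enforced
// eagerly at construction rather than left to the server to reject).
function assertVectorSearch(limit: number, numCandidates: number): void {
  assertPositiveInteger('limit', limit);
  assertPositiveInteger('numCandidates', numCandidates);
  if (limit > numCandidates) {
    throw new RangeError('limit must not exceed numCandidates');
  }
}
```

Failing fast at construction surfaces bad pipelines at build time instead of as opaque server errors at execution time.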
Replace unsafe cast with spread + runtime check to ensure window field operators lower to objects as expected.
…pe through search. Move MongoUpdatePipelineStage definition to stages.ts (alongside MongoPipelineStage) and use it for MongoMergeStage.whenMatched. Also keep Shape through search() and vectorSearch() since they return documents from the same collection.
Extract new-stage builder method tests into builder-new-stages.test.ts, keeping core builder behavior tests in the original file (307 lines).
Use toArray() instead of direct Promise cast for AsyncIterableResult in integration tests. Add lowering tests for $geoNear optional fields and $setWindowFields error branch to meet branch coverage threshold.
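Draining the async iterable instead of casting it to a Promise can be sketched with a small helper; `toArray` and the sample generator are illustrative names standing in for the real `AsyncIterableResult`:

```typescript
// Collect an AsyncIterable into an array via for-await, the safe alternative
// to casting the result object to a Promise.
async function toArray<T>(iter: AsyncIterable<T>): Promise<T[]> {
  const out: T[] = [];
  for await (const item of iter) {
    out.push(item);
  }
  return out;
}

// A tiny async generator standing in for a pipeline's result stream.
async function* sampleRows(): AsyncGenerator<number> {
  yield 1;
  yield 2;
  yield 3;
}
```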
Force-pushed from 94b2e57 to 4fcca4d.
Type tests (builder.test-d.ts): prove row type preservation for redact, out, merge, unionWith, densify, fill, search, vectorSearch, and row type reset for bucket, bucketAuto, facet, geoNear, graphLookup, setWindowFields, searchMeta. Integration tests (pipeline-builder.test.ts): verify bucketAuto, redact, graphLookup (self-join), and setWindowFields (running total) execute correctly against a real MongoDB instance.
closes TML-2212
Walkthrough — Remaining Stages, Pipeline-Style Updates & Cleanup
Intent
Make every MongoDB aggregation pipeline stage a first-class typed AST node — visitable, rewritable, and lowerable — so that the `Record<string, unknown>` escape hatch can be eliminated. Extend write commands to accept pipeline-style (aggregation-based) updates for computed writes. Add builder methods so users can construct the new stages ergonomically. Prove end-to-end type safety with rigorous type-level tests for every pipeline operation, and verify real-world behavior with integration tests against MongoDB.

Change map
The story
1. Rename `MongoReadStage` → `MongoPipelineStage`. The union included stages like `$count`, `$sample`, and `$redact` that aren't read-only. With output stages (`$out`, `$merge`) coming, the name mismatch would worsen. Pure rename, no behavior change.
2. Add 14 new pipeline stage AST classes. Each follows the established `MongoStageNode` pattern: readonly fields, `kind` discriminant, `accept()` for visitor dispatch, `rewrite()` for expression transformation, and `freeze()` in the constructor. Stages with `MongoAggExpr` fields recurse during rewrite. Stages with `MongoFilterExpr` recurse through the filter rewriter. Stages with nested pipelines recurse into each sub-pipeline stage. Atlas stages treat config as opaque and return `this` from `rewrite()`.
3. Extend the visitor and lowering. `MongoStageVisitor<R>` grows to 28 required methods. `lowerStage()` gains 14 new switch cases, each producing the corresponding MongoDB wire document.
4. Eliminate the `AggregatePipelineEntry` escape hatch. With all stages typed, `AggregateCommand.pipeline` becomes `ReadonlyArray<MongoPipelineStage>`. The `isTypedStage()` guard and the raw-object pass-through are removed. `lowerPipeline()` collapses to `stages.map(lowerStage)`.
5. Add pipeline-style update support. `MongoUpdatePipelineStage = MongoAddFieldsStage | MongoProjectStage | MongoReplaceRootStage`. `MongoUpdateSpec = Record<string, MongoValue> | ReadonlyArray<MongoUpdatePipelineStage>`. Update commands widen their `.update` field to `MongoUpdateSpec`. The adapter detects array vs object and lowers accordingly.
6. Add builder methods. `PipelineBuilder` gains 15 new methods (`redact`, `out`, `merge`, `unionWith`, `bucket`, `bucketAuto`, `geoNear`, `facet`, `graphLookup`, `setWindowFields`, `densify`, `fill`, `search`, `searchMeta`, `vectorSearch`).
7. Cover untested rewrite paths. Added rewrite tests for bucket/bucketAuto `output` accumulators, graphLookup `restrictSearchWithMatch`, geoNear query content verification, and merge with object `into`.
8. Prove end-to-end type safety. Added rigorous type-level tests (`builder.test-d.ts`, `runtime-types.test-d.ts`) that verify every pipeline operation resolves to concrete row types — `from()`, `match()`, `sort()`, `limit()`, `skip()`, `sample()`, `addFields()`, `project()` (inclusion and computed), `group()` (with all accumulator types), `unwind()`, `count()`, `sortByCount()`, `lookup()`, `replaceRoot()`, `pipe()`, nullable fields, chained pipelines, and `runtime.execute()` type flow.
9. Integration tests against real MongoDB. 19 tests exercising the pipeline builder DSL against MongoMemoryReplSet, covering match, sort, group, addFields, project, count, unwind, lookup, facet, bucket, unionWith, sortByCount, sample, out, merge, and complex multi-stage chains.
10. Demo app pipeline queries. Four new endpoints in the mongo demo app demonstrating real-world aggregation DSL usage: leaderboard, recent posts, posts with authors, and a faceted dashboard.
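The lowering step described above can be sketched as a switch over the `kind` discriminant; this is reduced to three stages for illustration, whereas the real `lowerStage` covers all 28 and operates on AST classes:

```typescript
// Simplified stage union standing in for MongoPipelineStage.
type PipelineStage =
  | { kind: 'match'; filter: Record<string, unknown> }
  | { kind: 'limit'; limit: number }
  | { kind: 'out'; collection: string };

function lowerStage(stage: PipelineStage): Record<string, unknown> {
  switch (stage.kind) {
    case 'match':
      return { $match: stage.filter };
    case 'limit':
      return { $limit: stage.limit };
    case 'out':
      return { $out: stage.collection };
    default: {
      // Exhaustiveness check: adding a stage kind without a case here
      // becomes a compile error.
      const _exhaustive: never = stage;
      throw new Error(`unhandled stage: ${JSON.stringify(_exhaustive)}`);
    }
  }
}

// With every stage typed, lowering a pipeline collapses to a single map.
const lowerPipeline = (stages: readonly PipelineStage[]) => stages.map(lowerStage);
```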
Behavior changes & evidence
- Every pipeline stage is a typed, visitable, rewritable, lowerable AST node. Before, 14 of ~28 stages fell through `Record<string, unknown>` — no compile-time validation, no visitor dispatch, no expression rewriting, no lowering. After, all 28 stages are first-class AST nodes with exhaustive visitor and lowering coverage. Previously, raw stages holding `MongoFilterExpr` or `MongoAggExpr` would silently skip rewriting and lowering — a correctness bug waiting to happen as usage expands.
- Untyped escape hatch eliminated. `AggregateCommand.pipeline` was `ReadonlyArray<MongoReadStage | Record<string, unknown>>` — raw objects bypassed all AST infrastructure. Now it's `ReadonlyArray<MongoPipelineStage>` — raw objects are a compile error (evidence: `@ts-expect-error` on a raw object).
- Write commands accept pipeline-style updates. `UpdateOneCommand.update` was `Record<string, MongoValue>` — only traditional operator documents. Now it's `MongoUpdateSpec` — either traditional or a pipeline array.
- Builder has typed methods for all major stages. Adds 15 new builder methods to `PipelineBuilder`.
- Pipeline DSL produces typed results end-to-end. Every pipeline operation (`from`, `match`, `sort`, `limit`, `skip`, `sample`, `addFields`, `project`, `group`, `unwind`, `count`, `sortByCount`, `lookup`, `replaceRoot`, `pipe`) resolves to concrete row types through `ResolveRow`, and `runtime.execute(plan)` returns `AsyncIterableResult<Row>` matching the plan's row type. Before, only intermediate `DocShape` keys were tested — the final resolved row types were never verified and silently resolved to `unknown` (evidence: `build()` return type, `execute<Row>(plan)`).
- Pipeline DSL works end-to-end against real MongoDB. 19 integration tests prove that pipelines constructed via the DSL execute correctly against a real MongoDB instance, covering 15 stage types across all major categories.
- Demo app demonstrates pipeline DSL. Four new pipeline endpoints in the mongo demo app showing real-world aggregation usage (leaderboard, recent posts, posts with authors, dashboard facet).
Compatibility / migration / risk
- `MongoReadStage` → `MongoPipelineStage`. Any external code referencing the old name will get a compile error. Acceptable since this is a prototype without external consumers.
- `UpdateOneCommand`, `UpdateManyCommand`, and `FindOneAndUpdateCommand` now accept `MongoUpdateSpec` instead of `Record<string, MongoValue>`. Existing ORM code continues to work unchanged.
- The `AggregatePipelineEntry` type is removed from the public API.

Follow-ups / open questions
- `$setWindowFields` type-level shape tracking. Window functions add computed fields. The initial builder method uses opaque `DocShape`. Precise tracking deferred.
- `$facet` type-level shape tracking. Each facet sub-pipeline has an independent output shape. The initial builder accepts typed sub-pipelines but uses opaque output.
- `$search`, `$searchMeta`, and `$vectorSearch` use `Record<string, unknown>` for config. Atlas Search operators form a complex, independently-versioned query language.

Non-goals / intentionally out of scope
- `computeUpdate()` / `computeUpdateOne()` terminal methods — infrastructure lands here, ergonomic builder API follows.
- Shape tracking for the opaque stages (`$facet`, `$setWindowFields`).
- `fn`/`acc` helpers — covered by TML-2217.

Summary by CodeRabbit
New Features
Tests
Examples