Job listing can lead to potential BSON too large error #856

@fmigneault

Description

Describe the bug

When using /jobs?groups=process, for example (it may also apply without groups), a large MongoDB document can trigger a BSON-too-large error while the aggregation pipeline is processed.

One example where this happens is a very large workflow whose job accumulates many log lines from all of its nested processes.

Other encoded data in Job.request and Job.response could also contribute to the document size.

Since these fields are not reported in the job listing, it might be possible to add a pre-filtering step that drops them before the grouping?

Maybe this is caused by the following pipeline stage?

"jobs": {"$push": "$$ROOT"}, # matched jobs for corresponding grouping categories

How to Reproduce

  1. Create a Job with >1000 lines of logs (maybe more are needed?)
  2. Call GET /jobs?groups=process so that the aggregation pipeline for the job listing is generated (see the sketch after this list)
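
For illustration, the request in step 2 could be issued as follows; the Weaver instance URL is a placeholder, and the failure mode noted in the comment is inferred from the issue title:

```python
# Hedged reproduction sketch: WEAVER_URL is a placeholder for an actual deployment.
import requests

WEAVER_URL = "https://example.com/weaver"

# Assumes step 1 was already done, i.e. a job exists with >1000 log lines.
resp = requests.get(f"{WEAVER_URL}/jobs", params={"groups": "process"})
print(resp.status_code)  # expected 200; instead, an error is returned when the
                         # grouped document exceeds MongoDB's BSON size limit
```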

Expected behavior

No error; the job listing is returned normally.

Labels

process/workflow: Related to a Workflow process
triage/bug: Something isn't working
