Added reactor-extension-alloy as a package within alloy #1448
Spencer-Smith wants to merge 1979 commits into main
Conversation
…library is selected. (#477)
* Fix data collection 'all settings' functional test
* Format

* Remove stylus dependency
* Replace the <Alert> component with Spectrum's <InlineAlert>
* Remove @spectrum tokens and typography
* Give field descriptions the same text color as labels

…e fields (#481)
* Add new validation and UI for link callbacks, add clear button to code fields
* Fix useEffect return value, fix init, change error style for code preview, only show clear button when it was set

* Added ability to set thirdPartyCookiesEnabled from a data element
* Remove console and reuse boolean or data element validation

* Update dependencies
* Update non-eslint dev deps
* Remove/migrate deprecated .eslintignore file
* Run prettier
* npm audit fix

…s enabled (#594)
* Show advertising settings in the send event view when the component is enabled.
* Skip for the moment
* Fix the view rendering.
* Fix the rendering.

* Update @adobe/alloy to 2.31.0
* Update @adobe/alloy to 2.31.1
Bumps [ajv](https://github.com/ajv-validator/ajv) from 8.17.1 to 8.18.0.
- [Release notes](https://github.com/ajv-validator/ajv/releases)
- [Commits](ajv-validator/ajv@v8.17.1...v8.18.0)

---
updated-dependencies:
- dependency-name: ajv
  dependency-version: 8.18.0
  dependency-type: direct:development
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
… scripts+workflows to project root
🦋 Changeset detected. Latest commit: c047c79. The changes in this PR will be included in the next version bump.
This pull request sets up GitHub code scanning for this repository. Once the scans have completed and the checks have passed, the analysis results for this pull request branch will appear on this overview. Once you merge this pull request, the 'Security' tab will show more code scanning analysis results (for example, for the default branch). Depending on your configuration and choice of analysis tool, future pull requests will be annotated with code scanning analysis results. For more information about GitHub code scanning, check out the documentation.
```js
      return;
    }

    target[key] = source[key];
```
Check warning (Code scanning / CodeQL): Prototype-polluting function (Medium)
Copilot Autofix (AI), about 21 hours ago
In general, to fix prototype pollution in deep merge/assign utilities, you must prevent sensitive prototype-related keys (`__proto__`, `constructor`, and often `prototype`) from being written or recursed into, regardless of their values. You can either:
- Block those keys explicitly, or
- Only recurse/assign when the destination already has the property and the key is safe.

The minimal, safest fix here, without changing the public behavior for normal keys, is to add a guard inside `deepAssignObject` so that any key equal to `__proto__`, `constructor`, or `prototype` is skipped entirely. This avoids ever writing to `Object.prototype` or other prototypes while leaving normal merging semantics intact. We do not need to touch the `isNil` and `isObject` helpers or the exported wrapper function.
Concretely:
- In `deepAssignObject`, before checking `isObject(target[key]) && isObject(source[key])`, add a condition that returns early for unsafe keys.
- Keep using `Object.keys(source)`, but simply `return` from the callback when the key is unsafe, so nothing is assigned or recursed for that key.
- No new imports or external dependencies are required; this is pure JavaScript logic inside the same file.

The change is localized to packages/reactor-extension/src/view/utils/deepAssign.js, lines 16–24, around the `Object.keys(source).forEach` loop.
```diff
@@ -15,6 +15,11 @@
 const deepAssignObject = (target, source) => {
   Object.keys(source).forEach((key) => {
+    // Prevent prototype pollution by blocking dangerous keys
+    if (key === "__proto__" || key === "constructor" || key === "prototype") {
+      return;
+    }
+
     if (isObject(target[key]) && isObject(source[key])) {
       deepAssignObject(target[key], source[key]);
       return;
```
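For context, here is a minimal, self-contained sketch of the hardened utility following the shape described above. The `isObject` implementation and the exported wrapper are assumptions for illustration; the actual code in packages/reactor-extension/src/view/utils/deepAssign.js may differ.

```js
// Illustrative sketch only, not the file from this PR.
const isObject = (value) =>
  value !== null && typeof value === "object" && !Array.isArray(value);

const deepAssignObject = (target, source) => {
  Object.keys(source).forEach((key) => {
    // Prevent prototype pollution by skipping dangerous keys entirely.
    if (key === "__proto__" || key === "constructor" || key === "prototype") {
      return;
    }

    // Recurse when both sides hold plain objects; otherwise overwrite.
    if (isObject(target[key]) && isObject(source[key])) {
      deepAssignObject(target[key], source[key]);
      return;
    }

    target[key] = source[key];
  });
};

// Exported wrapper: merges each source into target, left to right.
const deepAssign = (target, ...sources) => {
  sources.forEach((source) => {
    if (source !== null && source !== undefined) {
      deepAssignObject(target, source);
    }
  });
  return target;
};
```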
```js
      return;
    }

    target[key] = source[key];
```
Check warning (Code scanning / CodeQL): Prototype-polluting function (Medium, test)
Copilot Autofix (AI), about 21 hours ago
In general, to fix prototype-polluting deep assignment, you must prevent writes to `Object.prototype` (and other prototypes) via special keys. This is commonly done by skipping properties named `__proto__`, `constructor`, and sometimes `prototype`, or by only recursing into own properties of the destination object. Here, the minimal, behavior-preserving change is to add a guard that omits these dangerous keys from the merge.

The best fix with minimal functional change is to add a conditional inside the `Object.keys(source).forEach` loop in `deepAssignObject` that checks the current key and, if it is `__proto__`, `constructor`, or `prototype`, immediately returns from the callback (i.e., skips that key). For all other keys, the behavior remains exactly the same: if both `target[key]` and `source[key]` are objects, recurse; otherwise, assign `target[key] = source[key]`. This change is localized to packages/reactor-extension/test/unit/helpers/deepAssign.js within the body of `deepAssignObject` and requires no new imports or helper methods.
```diff
@@ -15,6 +15,10 @@
 const deepAssignObject = (target, source) => {
   Object.keys(source).forEach((key) => {
+    if (key === "__proto__" || key === "constructor" || key === "prototype") {
+      return;
+    }
+
     if (isObject(target[key]) && isObject(source[key])) {
       deepAssignObject(target[key], source[key]);
       return;
```
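As a quick illustration of the input class this guard defends against (a hypothetical example, not taken from the PR's tests), using the sketch shown earlier:

```js
// JSON.parse creates an own "__proto__" property without invoking the
// prototype setter, so Object.keys(source) will include it. Without the
// guard, the recursive branch would see target["__proto__"] ===
// Object.prototype and merge the payload into it, polluting every object.
const malicious = JSON.parse('{"__proto__": {"polluted": true}}');

const target = {};
deepAssign(target, malicious); // with the guard, the key is simply skipped

console.log({}.polluted); // undefined: Object.prototype was left untouched
```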
carterworks left a comment:
One bump we will run into: .changeset/pre.json says that all repos will be in beta mode at the same time. I'm not sure what will happen if we release the extension without releasing alloy. Will it mess up the beta version numbers? Will it also release alloy? I don't know.
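For readers unfamiliar with Changesets pre-release mode, .changeset/pre.json is roughly of the shape below (the values here are illustrative, not copied from this repo). Because initialVersions covers every package in the workspace, entering or exiting pre mode is an all-or-nothing switch, which is what raises the question above.

```json
{
  "mode": "pre",
  "tag": "beta",
  "initialVersions": {
    "@adobe/alloy": "2.31.1",
    "@adobe/reactor-extension-alloy": "0.0.1"
  },
  "changesets": []
}
```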
```sh
            p.dependencies['@adobe/alloy'] = process.env.ALLOY_NEW;
            fs.writeFileSync('package.json', JSON.stringify(p, null, 2));
          "
          pnpm version patch --no-commit-hooks
```
What do you think about always doing the same kind of version bump as @adobe/alloy (patch/minor/major) instead of always doing a patch version bump?
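One possible way to do that, sketched here with the semver package (hypothetical, not part of this PR; the ALLOY_OLD variable is assumed and does not appear in the workflow above, only ALLOY_NEW does):

```js
// Derive the release type from the previous and new @adobe/alloy versions
// instead of always running `pnpm version patch`.
const semver = require("semver");

// semver.diff returns "major" | "minor" | "patch" | pre-release variants,
// or null when the two versions are identical.
const releaseType =
  semver.diff(process.env.ALLOY_OLD, process.env.ALLOY_NEW) || "patch";

// The workflow could then run: pnpm version ${releaseType} --no-commit-hooks
console.log(releaseType);
```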
```yaml
          echo "Root version after versioning: ${VERSION}"

      - name: Bump extension when @adobe/alloy was released
        if: hashFiles('packages/reactor-extension/package.json') != ''
```
This conditional (and the copy of it on line 241) will always return true, since you have created that file. hashFiles returns a hash of the file, and the hash of a file that exists will never be ''
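A possible alternative gate, sketched below under the assumption that the workflow exposes the previous and new alloy versions as environment variables (ALLOY_OLD is hypothetical; only ALLOY_NEW appears in the diff above):

```yaml
      # Sketch only: gate on the version actually changing, since
      # hashFiles() is truthy for any file that exists in the checkout.
      - name: Bump extension when @adobe/alloy was released
        if: env.ALLOY_OLD != env.ALLOY_NEW
        run: echo "Bumping extension to follow @adobe/alloy ${ALLOY_NEW}"
```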
… the Alloy Dev workflow
Description
- Moves reactor-extension-alloy into packages/reactor-extension, preserving version history for files
- @adobe/alloy dependency from within the workspace

TODO:
Related Issue
Jira
Motivation and Context
Screenshots (if appropriate):
N/A
Checklist:
- I have added a changeset (`pnpm changeset`) or it is not necessary because this PR is not consumer-facing.