Conversation

@hvarfner (Contributor)

Required changes in GPyTorch to unblock meta-pytorch/botorch#3080. When load_state_dict is called with assign=True, PyTorch calls setattr on the _transformed attributes of the prior.

This was not the intended use of the _transformed attributes, but it seems we have to allow modifying them directly.
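For context, here is a minimal sketch of the PyTorch-level mechanism (assuming PyTorch >= 2.1, where assign=True is available; PriorLike and its rate attribute are hypothetical stand-ins, not GPyTorch's actual prior classes). With assign=True, load_state_dict replaces each buffer via setattr instead of copying into the existing tensor, so the buffered _transformed copy must accept direct assignment.

import torch
from torch import nn


class PriorLike(nn.Module):
    """Hypothetical stand-in for a prior that buffers a copy of a
    property-backed attribute under a _transformed_ name."""

    def __init__(self):
        super().__init__()
        # The buffered copy is what ends up in the state dict.
        self.register_buffer("_transformed_rate", torch.tensor(0.0))

    @property
    def rate(self):
        # The public attribute is a property, so it cannot itself be a
        # buffer; it is derived from the buffered copy instead.
        return self._transformed_rate.exp()


m = PriorLike()
sd = m.state_dict()  # contains "_transformed_rate"
sd["_transformed_rate"] = torch.tensor(1.0)

# With assign=True, PyTorch sets the buffer via
# setattr(module, "_transformed_rate", value) rather than copying in place,
# which is why the _transformed attributes must allow direct modification.
m.load_state_dict(sd, assign=True)
print(m.rate)  # tensor(2.7183)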

@hvarfner (Contributor, Author)

@Balandat

@hvarfner (Contributor, Author)

@SebastianAment

# Prefix for buffered attributes in TransformedDistributions.
# These are copies of the base distribution's attributes, enabling state_dict
# save/load, since the original attributes are properties and cannot be
# registered as buffers.
BUFFERED_PREFIX = "_buffered_"
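For illustration, a hypothetical helper (not part of the diff) showing how such a prefix is typically applied: the base distribution's property-backed value is cloned into a buffer under the prefixed name so that it lands in the state dict.

import torch
from torch import nn

# Hypothetical sketch: store a copy of a base-distribution attribute as a
# buffer under the prefixed name, e.g. "rate" -> "_buffered_rate".
def register_buffered_copy(module: nn.Module, name: str, value: torch.Tensor) -> None:
    module.register_buffer(BUFFERED_PREFIX + name, value.detach().clone())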
Collaborator

Would changing this from _transformed_ to _buffered_ cause issues with backward compatibility of loading a state dict?

Collaborator

Not sure if this would be a big deal; it's probably not super widely used, so I think we could merge this even if it breaks BC.

@hvarfner (Contributor, Author)

Yes, it would, unfortunately.
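If backward compatibility did need to be preserved, one hypothetical option (not part of this PR) would be a key-remapping shim applied to legacy state dicts before loading:

# Hypothetical shim: rewrite legacy "_transformed_" keys to the new
# "_buffered_" prefix before calling load_state_dict.
def remap_legacy_keys(state_dict: dict) -> dict:
    return {
        key.replace("_transformed_", "_buffered_", 1): value
        for key, value in state_dict.items()
    }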

