@brendanhasz (Owner)

PyTorch now enforces a check that the output of a traced function matches the output of the same function executed eagerly. This check will almost always fail in probflow applications, because a model's outputs typically differ between calls due to random posterior sampling.

To fix this, we'll just set `check_trace=False` in `torch.jit.trace_module`.
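As a minimal sketch of the fix (not probflow's actual model code): the hypothetical `StochasticModel` below adds random noise on every forward pass, mimicking posterior sampling, so the traced and eager outputs would never match. Passing `check_trace=False` skips that comparison.

```python
import torch


class StochasticModel(torch.nn.Module):
    """Toy stand-in for a probflow model whose output is random."""

    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 2)

    def forward(self, x):
        # Random noise makes two calls return different outputs,
        # just like sampling from a posterior does
        return self.linear(x) + torch.randn(2)


model = StochasticModel()
example = torch.randn(1, 4)

# check_trace=False skips the eager-vs-traced output comparison,
# which would otherwise flag the stochastic outputs as a mismatch
traced = torch.jit.trace_module(
    model, {"forward": example}, check_trace=False
)
out = traced(example)
```

With the default `check_trace=True`, tracing this module would compare two independent stochastic evaluations and report a mismatch.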

@brendanhasz brendanhasz self-assigned this Aug 8, 2022