Standardize parameter typing to float | torch.Tensor when appropriate #325

@orionarcher

Description

Throughout the API we often take parameters like kT, dt, and alpha. In some cases these are typed as float | torch.Tensor; in others they are just torch.Tensor. We should:

  1. Standardize on float | torch.Tensor in __init__ and step functions.
  2. Use x = torch.as_tensor(x) instead of the if statement often used to coerce floats.

Metadata

Assignees: No one assigned

Labels: docs (Improvements or additions to documentation), enhancement (New feature or request)

Type: No type

Projects: No projects

Milestone: No milestone

Relationships: None yet

Development: No branches or pull requests
