
Question on Soft argmax implementations. #76

Open
widiba03304 opened this issue Jan 6, 2024 · 0 comments

widiba03304 commented Jan 6, 2024

AFAIK, soft argmax is an expectation over a heatmap, so it should be computable without control flow such as if or for statements. However, both the TensorFlow and PyTorch implementations in this repository contain some control flow.

Can you check if my understanding is correct?

  1. Assume an input of shape [B, D, J, H, W].
  2. The softmax is applied over the axes [D, H, W], so that the activation sums to 1 over [D, H, W].
  3. However, these implementations multiply by index grids (linspace) built after reducing over [D, H], [D, W], and [H, W], not over [D, H, W].

Why is the softmax performed over [D, H, W] if the indices it is multiplied by correspond to [D, H], [D, W], and [H, W]?
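To illustrate what I mean, here is a minimal NumPy sketch (shapes and names are my own, not taken from the repository's code). It shows that, for a heatmap normalized jointly over [D, H, W], the expected coordinate along one axis can equivalently be computed either by marginalizing over the other two axes first or by multiplying the full joint heatmap by a per-axis index grid:

```python
import numpy as np

# Hypothetical shapes: B batches, D depth bins, J joints, H x W spatial bins.
B, D, J, H, W = 2, 4, 3, 5, 6
rng = np.random.default_rng(0)
logits = rng.normal(size=(B, D, J, H, W))

# Softmax jointly over the D, H, W axes, so the heatmap for each
# (batch, joint) pair sums to 1 over all D*H*W positions.
flat = logits.transpose(0, 2, 1, 3, 4).reshape(B, J, -1)  # [B, J, D*H*W]
p = np.exp(flat - flat.max(axis=-1, keepdims=True))       # stable softmax
p /= p.sum(axis=-1, keepdims=True)
heat = p.reshape(B, J, D, H, W)

# Expected x-coordinate, computed two equivalent ways:
# (a) marginalize over D and H first, then take the expectation over W
marg_w = heat.sum(axis=(2, 3))                  # [B, J, W]
x_a = (marg_w * np.arange(W)).sum(axis=-1)      # [B, J]

# (b) multiply the full joint heatmap by a W-index grid and sum everything
idx_w = np.arange(W).reshape(1, 1, 1, 1, W)     # varies only along W
x_b = (heat * idx_w).sum(axis=(2, 3, 4))        # [B, J]

# By linearity of expectation the two agree exactly.
assert np.allclose(x_a, x_b)
```

So the two views are mathematically interchangeable; my question is about why the implementations mix the joint [D, H, W] softmax with pairwise-axis index grids.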

Thanks in advance.
