1. Core Implementation

File: src/frontends/pytorch/src/op/cross_entropy_loss.cpp

Implements the translate_cross_entropy_loss function, which converts PyTorch's aten::cross_entropy_loss operation into an equivalent subgraph of OpenVINO operations. The implementation follows PyTorch's specification:

cross_entropy_loss = log_softmax + nll_loss
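
This decomposition can be illustrated with a small NumPy sketch (NumPy here stands in for the OpenVINO graph ops; the helper names are illustrative, not taken from the source):

```python
import numpy as np

# Illustrative NumPy sketch of the decomposition; these helpers are
# stand-ins for the LogSoftmax and NLL-selection graph ops, not code
# from the pull request.
def log_softmax(x, axis=1):
    # Numerically stable log-softmax along the class axis.
    shifted = x - x.max(axis=axis, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=axis, keepdims=True))

def nll_loss(log_probs, target):
    # Negative log-likelihood: minus the log-probability of the target class.
    return -log_probs[np.arange(log_probs.shape[0]), target]

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 1.2, 0.3]])
target = np.array([0, 2])
per_sample_loss = nll_loss(log_softmax(logits), target)
```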

Key features:

  • Applies LogSoftmax along dimension 1 (class dimension)
  • Uses one-hot masking plus ReduceSum to select the target-class log-probabilities (the NLL loss step)
  • Supports optional class weights via Gather + Multiply
  • Supports three reduction modes:
    • none (0): No reduction, returns per-sample loss
    • mean (1): Mean of all losses (default)
    • sum (2): Sum of all losses
  • Handles both 2D inputs (N, C) and higher-dimensional inputs (N, C, d1, d2, ...)
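
Under the features listed above, the full pipeline (LogSoftmax, one-hot Multiply, ReduceSum over the class axis, optional Gather-ed class weights, and the three reduction modes) could be sketched in NumPy as follows. The function name and signature are illustrative only, not the PR's actual code:

```python
import numpy as np

def cross_entropy_loss(logits, target, weight=None, reduction="mean"):
    # Sketch mirroring the described graph ops:
    # LogSoftmax -> OneHot -> Multiply -> ReduceSum, with optional
    # Gather/Multiply for class weights. Not the OpenVINO implementation.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))

    num_classes = logits.shape[1]
    # One-hot mask on the class axis; works for (N, C) and (N, C, d1, ...).
    one_hot = np.moveaxis(np.eye(num_classes)[target], -1, 1)
    per_sample = -(one_hot * log_probs).sum(axis=1)  # ReduceSum over classes

    if weight is not None:
        w = weight[target]           # Gather per-target weights
        per_sample = per_sample * w  # Multiply
        if reduction == "mean":
            # Per PyTorch's spec, the weighted mean divides by the
            # sum of the gathered weights, not by N.
            return per_sample.sum() / w.sum()
    if reduction == "none":
        return per_sample
    if reduction == "sum":
        return per_sample.sum()
    return per_sample.mean()
```

Note the weighted-mean branch: dividing by the sum of gathered weights rather than the sample count is what PyTorch's cross_entropy specifies, and it is the detail most easily gotten wrong in a reimplementation.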

2. Operation Registration

File: src/frontends/pytorch/src/op_table.cpp

  • Added forward declaration: OP_CONVERTER(translate_cross_entropy_loss);
  • Registered operation: {"aten::cross_entropy_loss", op::translate_cross_entropy_loss},

3. Tests

File: tests/layer_tests/pytorch_tests/test_cross_entropy_loss.py

Comprehensive test suite covering:

  • Basic functionality with different reduction modes (mean, sum, none)
  • Class weights
  • Different input shapes (2D and spatial dimensions)
  • Edge cases (binary classification, single sample, many classes)
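
To give a flavor of what such tests verify, here is a self-contained sanity case (hypothetical, not taken from the test file): with all-zero logits over C classes the softmax is uniform, so every per-sample loss equals log C regardless of the target, including the binary and many-class edge cases.

```python
import numpy as np

# Reference loss used only for this sanity check; not the PR's code.
def reference_cross_entropy(logits, target):
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(target)), target]

for num_classes in (2, 5, 100):  # binary, small, many classes
    logits = np.zeros((4, num_classes))  # uniform softmax
    target = np.arange(4) % num_classes
    loss = reference_cross_entropy(logits, target)
    assert np.allclose(loss, np.log(num_classes))
```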

Rahuldrabit requested a review from a team as a code owner on October 26, 2025.