Conversation

@taoleicn
Contributor

No description provided.

@taoleicn taoleicn requested a review from jinpoon March 17, 2021 01:21
normalize_after: bool, optional
if True, apply post layer normalization; otherwise apply pre layer normalization
(default=True).

Would you mind adding a docstring for right_window as well? I just want to make sure what the legal range of this parameter is.
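For example, something along these lines could work (just a sketch, matching the normalize_after entry above; I'm assuming right_window is the number of future positions a query may attend to, and the default shown here is made up):

```python
right_window: int, optional
    the number of future (right-context) positions each query can attend to;
    must be a non-negative integer, where 0 disables look-ahead (default=0).
```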

state["saved_query"] = query_not_ready
state["saved_key"] = k
state["saved_value"] = v
incremental_state["attn_state"] = state
When there are no ready queries, would it be more efficient to just return an empty tensor here?
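Roughly what I mean (sketch only; num_ready_queries, batch_size, and output_dim are illustrative names, not from this PR):

```python
# Sketch only: after buffering the pending queries/keys/values above, skip the
# attention computation entirely when nothing is ready to be attended over.
# num_ready_queries, batch_size, and output_dim are illustrative names.
if num_ready_queries == 0:
    return query_not_ready.new_zeros(0, batch_size, output_dim)
```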

input: Tensor,
mask_pad: Optional[Tensor] = None,
attn_mask: Optional[Tensor] = None,
incremental_state: Optional[Dict[str, Dict[str, Optional[Tensor]]]] = None,
Can you change the data structure of incremental_state to Optional[Dict[str, Optional[Tensor]]]? It seems SRUCell does use the incremental_state, so I would need to write some glue code inheriting from fairseq's incremental-state class, and assign and maintain a layer-specific hash as the key into this dictionary, instead of attn_state.
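To make that concrete, this is roughly the kind of glue I have in mind on the fairseq side (just a sketch: the wrapper class, the "state" key, and the cell call signature are illustrative, not part of this PR). The SRU layer itself would then only ever see a flat Optional[Dict[str, Optional[Tensor]]]:

```python
import torch
from typing import Dict, Optional
from torch import Tensor
from fairseq.incremental_decoding_utils import with_incremental_state

@with_incremental_state
class FairseqSRULayer(torch.nn.Module):
    """Hypothetical wrapper mapping fairseq's module-keyed incremental_state
    onto the flat Dict[str, Optional[Tensor]] the SRU layer would consume."""

    def __init__(self, cell: torch.nn.Module):
        super().__init__()
        self.cell = cell  # an SRUCell-like module that accepts a flat state dict

    def forward(self, x: Tensor, incremental_state=None) -> Tensor:
        flat_state: Optional[Dict[str, Optional[Tensor]]] = None
        if incremental_state is not None:
            # get_/set_incremental_state prefix the key with a module-unique id,
            # so "state" is effectively layer-specific without the layer knowing.
            flat_state = self.get_incremental_state(incremental_state, "state") or {}
        out = self.cell(x, incremental_state=flat_state)
        if incremental_state is not None:
            self.set_incremental_state(incremental_state, "state", flat_state)
        return out
```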

c0: Optional[Tensor] = None,
mask_pad: Optional[Tensor] = None,
attn_mask: Optional[Tensor] = None,
incremental_state: Optional[Dict[str, Dict[str, Optional[Tensor]]]] = None,
Same comment as above: could incremental_state here also be changed to Optional[Dict[str, Optional[Tensor]]]?
