
Review for v2 - Entropies/Complexities/Multiscale - Datseris #220

Merged
36 commits merged into main on Dec 26, 2022

Conversation

Datseris
Member

Closes #219. Currently WIP.

Changes done:

  • Rename Entropy to EntropyDefinition.
  • Rename EntropyEstimator to DifferentialEntropyEstimator.
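
For readers following along, here is a minimal Julia sketch of what the two renames above amount to; the concrete subtypes and their fields are illustrative assumptions, not the package's actual source:

```julia
# Sketch only: the abstract supertypes are renamed, concrete types keep their names.

abstract type EntropyDefinition end             # previously: Entropy
abstract type DifferentialEntropyEstimator end  # previously: EntropyEstimator

# Hypothetical concrete subtypes, just to show where each supertype sits:
struct Shannon <: EntropyDefinition
    base::Real
end

struct Kraskov <: DifferentialEntropyEstimator
    k::Int
end
```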

@codecov

codecov bot commented Dec 25, 2022

Codecov Report

Merging #220 (6e273fc) into main (cbb3319) will decrease coverage by 0.04%.
The diff coverage is 75.75%.

❗ Current head 6e273fc differs from pull request most recent head 3545738. Consider uploading reports for the commit 3545738 to get more accurate results

@@            Coverage Diff             @@
##             main     #220      +/-   ##
==========================================
- Coverage   84.47%   84.43%   -0.05%     
==========================================
  Files          45       45              
  Lines        1140     1137       -3     
==========================================
- Hits          963      960       -3     
  Misses        177      177              
Impacted Files Coverage Δ
...e_dispersion_entropy/reverse_dispersion_entropy.jl 100.00% <ø> (ø)
src/entropies/convenience_definitions.jl 100.00% <ø> (ø)
src/entropies/curado.jl 100.00% <ø> (ø)
...estimators/nearest_neighbors/KozachenkoLeonenko.jl 100.00% <ø> (ø)
.../entropies/estimators/nearest_neighbors/Kraskov.jl 100.00% <ø> (ø)
src/entropies/estimators/nearest_neighbors/Zhu.jl 100.00% <ø> (ø)
...entropies/estimators/nearest_neighbors/ZhuSingh.jl 100.00% <ø> (ø)
...ies/estimators/order_statistics/AlizadehArghami.jl 100.00% <ø> (ø)
...rc/entropies/estimators/order_statistics/Correa.jl 100.00% <ø> (ø)
.../entropies/estimators/order_statistics/Ebrahimi.jl 100.00% <ø> (ø)
... and 18 more


@kahaaga
Member

kahaaga commented Dec 25, 2022

The analytical tests here (barely) fail due to #214. We can fix that in a separate PR.

@Datseris
Member Author

What's the maximum entropy for:

  1. Kaniadakis
  2. Curado
  3. Stretched exponential

? It's not mentioned in the docstrings.

@kahaaga
Member

kahaaga commented Dec 26, 2022

What's the maximum entropy for Kaniadakis

There's an open issue (#182) on defining the maximum entropy for the Kaniadakis entropy. I haven't found anything in the literature on it yet, although I haven't searched very thoroughly. It might also be the case that nobody has defined it in the literature yet. It would be a fun exercise to figure it out, but I don't have time for that at the moment, so it just means that entropy_normalized doesn't work for Kaniadakis for now.

What's the maximum entropy for Curado

For a total of L outcomes, the maximum for Curado is L * (1 - exp(-b/L)) + exp(-b) - 1.

Stretched exponential

The maximum is defined in the source code. It involves various gamma functions, so it's better to just look there.
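
For reference, the Curado maximum quoted above can be written as a tiny Julia helper; the function name and the entropy_normalized remark are illustrative assumptions, only the formula itself comes from the comment:

```julia
# Hypothetical helper: maximum Curado entropy for L total outcomes and parameter b,
# i.e. the value a normalized entropy (cf. entropy_normalized) would divide by.
curado_maximum(L::Integer, b::Real) = L * (1 - exp(-b / L)) + exp(-b) - 1

curado_maximum(4, 1.0)  # ≈ 0.2527
```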

@Datseris
Member Author

Keyword metric is not used anywhere in ApproximateEntropy as far as I can tell. Should we remove it?

@kahaaga
Member

kahaaga commented Dec 26, 2022

Keyword metric is not used anywhere in ApproximateEntropy as far as I can tell. Should we remove it?

It should be used inside compute_ϕ. That said, I don't know if we should document metric in either SampleEntropy or ApproximateEntropy, because they both explicitly use the maximum norm. The field metric is just there because, when implementing them, I thought the methods were generalizable to other metrics. Algorithmically speaking, any metric generalization is possible, but I don't know whether it theoretically makes any sense to use other metrics.

So to not confuse anyone, let's just keep metric as an internal implementation detail, and just say in the docstrings that the Chebyshev metric is used. That is, both for SampleEntropy and ApproximateEntropy.
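
As a side note, the Chebyshev (maximum-norm) distance both algorithms rely on is simply the largest coordinate-wise difference; a minimal sketch, not the package's implementation:

```julia
# Chebyshev / maximum-norm distance between two equal-length points:
# the largest absolute difference over all coordinates.
maxnorm_dist(a, b) = maximum(abs.(a .- b))

maxnorm_dist([1.0, 2.0, 3.0], [1.5, 1.0, 3.2])  # == 1.0
```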

@Datseris
Member Author

I would say that we shouldn't have metric. They use Chebyshev, we use Chebyshev. Overgeneralization is a problem when it serves no purpose.

@kahaaga
Member

kahaaga commented Dec 26, 2022

They use Chebyshev, we use Chebyshev. Overgeneralization is a problem when it serves no purpose.

Sure, that's also fine.

@Datseris Datseris changed the title Review for v2 - Entropies - Datseris Review for v2 - Entropies/Complexities/Multiscale - Datseris Dec 26, 2022
@Datseris Datseris marked this pull request as ready for review December 26, 2022 21:39
@Datseris
Member Author

This is good to go. Many examples don't work at the moment, but that's for a different PR that fixes the examples.

@kahaaga
Member

kahaaga commented Dec 26, 2022

This is good to go. Many examples don't work at the moment, but that's for a different PR that fixes the examples.

Ok, then I'll review this quickly and merge.

Will you fix the examples, or should I do it?

EDIT: I'll at least fix the tests while reviewing this. Maybe I'll do the examples too if it's nothing major.
EDIT 2: I fixed the examples too. It wasn't too much work.

@kahaaga
Member

kahaaga commented Dec 26, 2022

@Datseris Ok, I think I'm done here now. I did

  • Fix some typos, cross-references and missing/broken links in the documentation.
  • Remove the not-yet-part-of-the-public-API complexity example from the docs. It's still there, but in a separate file in the docs/src/dev folder.
  • Thoroughly review the documentation text for the main page, and for the "Entropies" section. I'm happy to say that I have no more issues with it, nor for the naming of methods/types etc. It is short, concise and clear.
  • Fix all broken tests related to the changes you made here. Mostly, that just involved calling internal methods instead of the public methods (yes, I know we should only test the public API, but these tests will be part of the public API, so I just adjusted them here so that CI passes until we get the multiscale API settled).
  • Fix broken documentation examples.

I'll merge when CI passes.

@kahaaga
Member

kahaaga commented Dec 26, 2022

The tests and docs passed. The last commit just fixes a typo in the docs.

@kahaaga kahaaga merged commit 8ff6f7d into main Dec 26, 2022
@kahaaga kahaaga deleted the datsers_review_entropies branch December 26, 2022 23:12
Development

Successfully merging this pull request may close these issues.

Remove method entropy([e::Entropy,] est::EntropyEstimator, x)