Review of codebase and docs - Probabilities and Encodings - Datseris #213

Merged: 36 commits, Dec 25, 2022
Changes from 10 commits

Commits (36)
edf5da6
update probabilities table
Datseris Dec 22, 2022
b8a416d
CountOccurrences works with `Any` input
Datseris Dec 22, 2022
789c5d4
better terminology header
Datseris Dec 22, 2022
d6a327a
simpler headers in probabilities
Datseris Dec 22, 2022
fac8c8a
Add encodings page
Datseris Dec 22, 2022
5dfbc98
simplify SymbolicPermutation docstring
Datseris Dec 22, 2022
d57c4e6
reference complexity measures
Datseris Dec 22, 2022
7150a6e
correct docstring to reference isrand
Datseris Dec 22, 2022
fb9f13f
more organized tests for symbolic permutat
Datseris Dec 22, 2022
6815cad
full rewrite of `SymbolicPermutation` and proper `encode` for Ordinal.
Datseris Dec 22, 2022
e8d5ade
type optimization in making the embedding
Datseris Dec 22, 2022
bb9e450
remove entropy!
Datseris Dec 22, 2022
90376c4
simplify probabilities! even more
Datseris Dec 22, 2022
2d1f455
move fasthist to encoding folder
Datseris Dec 22, 2022
b2572bd
complete unification of symbolic perm methods
Datseris Dec 22, 2022
995fcca
docstring for weighted version
Datseris Dec 23, 2022
1da7c87
add docstring to amplitude aware
Datseris Dec 23, 2022
9f56b1d
delete ALL other files
Datseris Dec 23, 2022
48143e6
fix all symbolic permutation tests
Datseris Dec 23, 2022
3dea113
fix all permutation tests (and one file only)
Datseris Dec 23, 2022
8d75e24
clarify source code of encode Gaussian
Datseris Dec 23, 2022
de128a1
better docstring for GaussEncod
Datseris Dec 23, 2022
5acdd86
simplify docstring of Dispersion
Datseris Dec 23, 2022
20867aa
more tests for naivekernel
Datseris Dec 23, 2022
772af38
Zhu -> Correa
Datseris Dec 23, 2022
c5a5cb8
shorter docstring for spatial permutation
Datseris Dec 23, 2022
17dd4e6
port spatial permutation example to Examples
Datseris Dec 23, 2022
04265cb
re-write SpatialSymb to have encoding as field. All tests pass.
Datseris Dec 24, 2022
69fadb7
better display of examples in decode
Datseris Dec 24, 2022
17f9a17
better doc for ordinal encoding
Datseris Dec 24, 2022
a1c9c65
Some typos/nitpickery
kahaaga Dec 25, 2022
e78231a
Probabilities can't compute.
kahaaga Dec 25, 2022
3c54910
Don't duplicate `SpatialDispersion`
kahaaga Dec 25, 2022
da35469
Clarify docstrings a bit
kahaaga Dec 25, 2022
99131b7
Typo
kahaaga Dec 25, 2022
7f77993
Cross-reference spatial estimators
kahaaga Dec 25, 2022
1 change: 0 additions & 1 deletion docs/Project.toml
@@ -2,7 +2,6 @@
CairoMakie = "13f3f980-e62b-5c42-98c6-ff1f3baf88f0"
ChaosTools = "608a59af-f2a3-5ad4-90b4-758bdf3122a7"
CoordinateTransformations = "150eb455-5306-5404-9cee-2592286d6298"
DelayEmbeddings = "5732040d-69e3-5649-938a-b6b4f237613f"
Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
DocumenterTools = "35a29f4d-8980-5a13-9543-d66fff28ecb8"
3 changes: 2 additions & 1 deletion docs/make.jl
@@ -2,10 +2,10 @@ cd(@__DIR__)
using Pkg
CI = get(ENV, "CI", nothing) == "true" || get(ENV, "GITHUB_TOKEN", nothing) !== nothing
using Entropies
using DelayEmbeddings
using Documenter
using DocumenterTools: Themes
using CairoMakie
using Entropies.DelayEmbeddings
import Entropies.Wavelets

# %% JuliaDynamics theme
@@ -35,6 +35,7 @@ ENV["JULIA_DEBUG"] = "Documenter"
ENTROPIES_PAGES = [
"index.md",
"probabilities.md",
"encodings.md",
"entropies.md",
"complexity.md",
"multiscale.md",
2 changes: 1 addition & 1 deletion docs/src/devdocs.md
@@ -11,7 +11,7 @@ Good practices in developing a code base apply in every Pull Request. The [Good
5. If suitable, the estimator may be able to operate based on [`Encoding`]s. If so, it is preferred to implement an `Encoding` subtype and extend the methods [`encode`](@ref) and [`decode`](@ref). This will allow your probabilities estimator to be used with a larger span of entropy and complexity methods without additional effort.
6. Implement dispatch for [`probabilities_and_outcomes`](@ref) and your probabilities estimator type.
7. Implement dispatch for [`outcome_space`](@ref) and your probabilities estimator type.
8. Add your probabilities estimator type to the list in the docstring of [`ProbabilitiyEstimator`](@ref), and if you also made an encoding, add it to the [`Encoding`](@ref) docstring.
8. Add your probabilities estimator type to the table in the documentation page on probabilities. If you made an encoding, also add it to the corresponding table in the encodings section.
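
For illustration, here is a minimal sketch of step 5 above: a hypothetical `Encoding` subtype that extends `encode` and `decode`. The type, its field, and the binning rule are invented for the example and are not part of the package.

```julia
using Entropies

# Hypothetical encoding: maps a number in [0, 1] to one of `n` equally spaced
# integer symbols, and decodes a symbol back to its bin midpoint.
struct UnitIntervalEncoding <: Encoding
    n::Int
end

function Entropies.encode(e::UnitIntervalEncoding, x::Real)
    0 ≤ x ≤ 1 || throw(ArgumentError("x must lie in [0, 1]"))
    return min(floor(Int, x * e.n) + 1, e.n)  # positive integer in 1:n
end

function Entropies.decode(e::UnitIntervalEncoding, i::Int)
    return (i - 0.5) / e.n  # representative value for symbol `i`
end
```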

### Optional steps
You may extend any of the following functions if there are potential performance benefits in doing so:
20 changes: 20 additions & 0 deletions docs/src/encodings.md
@@ -0,0 +1,20 @@
# Encodings

## Encoding API

Some probability estimators first "encode" input data into an intermediate representation indexed by the positive integers. This intermediate representation is called an "encoding" and its API is defined by the following:

```@docs
Encoding
encode
decode
```
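
As a quick illustration of `encode`/`decode`, here is a sketch that relies on the `OrdinalPatternEncoding` behaviour described in the source changes further down in this PR:

```julia
using Entropies

x = [1.2, 5.4, 2.2, 1.1]          # a length-4 state vector
enc = OrdinalPatternEncoding(4)    # encodes ordinal patterns of length 4
i = encode(enc, x)                 # an integer in 1:factorial(4)
decode(enc, i)                     # the ordinal pattern of `x`, i.e. sortperm(x)
```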

## Available encodings

```@docs
OrdinalPatternEncoding
GaussianCDFEncoding
RectangularBinEncoding
```

33 changes: 18 additions & 15 deletions docs/src/index.md
@@ -8,23 +8,22 @@ Entropies
You are reading the development version of the documentation of Entropies.jl,
that will become version 2.0.

## API & terminology
## Terminology

!!! note
The documentation here follows (loosely) chapter 5 of
[Nonlinear Dynamics](https://link.springer.com/book/10.1007/978-3-030-91032-7),
Datseris & Parlitz, Springer 2022.

In the literature, the term "entropy" is used (and abused) in multiple contexts.
The API and documentation of Entropies.jl aim to clarify some aspects of its usage, and
to provide a simple way to obtain probabilities, entropies, or other complexity measures.
The API and documentation of Entropies.jl aim to clarify some aspects of its usage, and to provide a simple way to obtain probabilities, entropies, or other complexity measures.

### Probabilities

Entropies and other complexity measures are typically computed based on _probability distributions_.
These are obtained from [Input data for Entropies.jl](@ref) in a plethora of different ways.
The central API function that returns a probability distribution (in fact, just a vector of probabilities) is [`probabilities`](@ref), which takes in a subtype of [`ProbabilitiesEstimator`](@ref) to specify how the probabilities are computed.
All estimators available in Entropies.jl can be found in the [estimators page](@ref probabilities_estimators).
These can be obtained from input data in a plethora of different ways.
The central API function that returns a probability distribution (or more precisely a probability mass function) is [`probabilities`](@ref), which takes in a subtype of [`ProbabilitiesEstimator`](@ref) to specify how the probabilities are computed.
All available estimators can be found in the [estimators page](@ref probabilities_estimators).
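
A minimal sketch of the call pattern (the `probabilities(est, x)` argument order is taken from a source comment further down in this diff; the choice of estimator is only for illustration):

```julia
using Entropies

x = rand(1:4, 10_000)                      # a discrete-valued timeseries
p = probabilities(CountOccurrences(), x)   # relative frequency of each unique value
sum(p) ≈ 1.0                               # the probabilities always sum to 1
```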

### Entropies

@@ -40,24 +39,28 @@ Thus, any of the implemented [probabilities estimators](@ref probabilities_estim

These names are commonplace, and so in Entropies.jl we provide convenience functions like [`entropy_wavelet`](@ref). However, it should be noted that these functions really aren't anything more than 2-lines-of-code wrappers that call [`entropy`](@ref) with the appropriate [`ProbabilitiesEstimator`](@ref).

In addition to `ProbabilitiesEstimators`, we also provide [`EntropyEstimator`](@ref)s,
which compute entropies via alternate means, without explicitly computing some
In addition to `ProbabilitiesEstimators`, we also provide [`EntropyEstimator`](@ref)s,
which compute entropies via alternate means, without explicitly computing some
probability distribution. Differential/continuous entropy, for example, is computed
using a dedicated [`EntropyEstimator`](@ref). For example, the [`Kraskov`](@ref)
estimator computes Shannon differential entropy via a nearest neighbor algorithm, while
using a dedicated [`EntropyEstimator`](@ref). For example, the [`Kraskov`](@ref)
estimator computes Shannon differential entropy via a nearest neighbor algorithm, while
the [`Zhu`](@ref) estimator computes Shannon differential entropy using order statistics.
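
A sketch of the distinction drawn above. The exact `entropy` method signatures and the default keyword arguments of `Kraskov` are assumptions here, not something stated in this diff:

```julia
using Entropies

x = Dataset(rand(10_000, 3))

# Discrete (plug-in) entropy: a probability distribution is estimated first.
h_plug_in = entropy(ValueHistogram(RectangularBinning(10)), x)

# Differential entropy via a dedicated `EntropyEstimator`: no explicit
# probability distribution is computed.
h_diff = entropy(Kraskov(), x)
```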

### Other complexity measures

Other complexity measures, which strictly speaking don't compute entropies, and may or may
not explicitly compute probability distributions, are found in
[Complexity.jl](https://github.com/JuliaDynamics/Complexity.jl) package. This includes
measures like sample entropy and approximate entropy.
Other complexity measures, which strictly speaking don't compute entropies, and may or may not explicitly compute probability distributions, are found in
the [Complexity measures](@ref) page.
This includes measures like sample entropy and approximate entropy.

## [Input data for Entropies.jl](@id input_data)

The input data type typically depend on the probability estimator chosen. In general though, the standard DynamicalSystems.jl approach is taken and as such we have three types of input data:
The input data type typically depends on the probability estimator chosen.
In general though, the standard DynamicalSystems.jl approach is taken and as such we have three types of input data:

- _Timeseries_, which are `AbstractVector{<:Real}`, used in e.g. with [`WaveletOverlap`](@ref).
- _Multi-dimensional timeseries, or datasets, or state space sets_, which are [`Dataset`](@ref), used e.g. with [`NaiveKernel`](@ref).
- _Spatial data_, which are higher dimensional standard `Array`s, used e.g. with [`SpatialSymbolicPermutation`](@ref).
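
To make the three input types concrete (a sketch; the estimator pairings are the examples already named in the list above):

```julia
using Entropies

x_timeseries = rand(1000)               # timeseries: AbstractVector{<:Real}, e.g. for WaveletOverlap
x_dataset    = Dataset(rand(1000, 3))   # state space set: Dataset, e.g. for NaiveKernel
x_spatial    = rand(50, 50)             # spatial data: a higher-dimensional Array,
                                        # e.g. for SpatialSymbolicPermutation
```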

```@docs
Dataset
```
40 changes: 26 additions & 14 deletions docs/src/probabilities.md
@@ -1,4 +1,4 @@
# [Probabilities](@id probabilities_estimators)
# Probabilities

## Probabilities API

@@ -8,51 +8,63 @@ The probabilities API is defined by
- [`probabilities`](@ref)
- [`probabilities_and_outcomes`](@ref)

and related functions that you will find in the following documentation blocks:

### Probabilities

```@docs
ProbabilitiesEstimator
probabilities
probabilities!
Probabilities
```

### Outcomes

```@docs
probabilities_and_outcomes
outcomes
outcome_space
total_outcomes
missing_outcomes
```
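
A sketch of how the outcome-related functions fit together (the estimator and its keywords are illustrative; check the exact signatures against the docstrings above):

```julia
using Entropies

x = rand(1000)
est = SymbolicPermutation(m = 3)

probs, outs = probabilities_and_outcomes(est, x)  # probabilities and the outcomes they belong to
total_outcomes(est)                               # factorial(3) = 6 possible ordinal patterns
outcome_space(est)                                # all 6 possible ordinal patterns
missing_outcomes(est, x)                          # how many outcomes were never observed in `x`
```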

## Overview
## [Overview of probabilities estimators](@id probabilities_estimators)

Any of the following estimators can be used with [`probabilities`](@ref).
Any of the following estimators can be used with [`probabilities`](@ref)
(in the column "input data" it is assumed that the `eltype` of the input is `<: Real`).

| Estimator | Principle | Input data |
| ------------------------------------------- | --------------------------- | ------------------- |
| [`CountOccurrences`](@ref) | Frequencies | `Vector`, `Dataset` |
|:--------------------------------------------|:----------------------------|:--------------------|
| [`CountOccurrences`](@ref) | Count of unique elements | `Any` |
| [`ValueHistogram`](@ref) | Binning (histogram) | `Vector`, `Dataset` |
| [`TransferOperator`](@ref) | Binning (transfer operator) | `Vector`, `Dataset` |
| [`NaiveKernel`](@ref) | Kernel density estimation | `Dataset` |
| [`SymbolicPermutation`](@ref) | Ordinal patterns | `Vector` |
| [`SymbolicWeightedPermutation`](@ref) | Ordinal patterns | `Vector` |
| [`SymbolicAmplitudeAwarePermutation`](@ref) | Ordinal patterns | `Vector` |
| [`SymbolicPermutation`](@ref) | Ordinal patterns | `Vector`, `Dataset` |
| [`SymbolicWeightedPermutation`](@ref) | Ordinal patterns | `Vector`, `Dataset` |
| [`SymbolicAmplitudeAwarePermutation`](@ref) | Ordinal patterns | `Vector`, `Dataset` |
| [`SpatialSymbolicPermutation`](@ref) | Ordinal patterns in space | `Array` |
| [`Dispersion`](@ref) | Dispersion patterns | `Vector` |
| [`SpatialDispersion`](@ref) | Dispersion patterns in space | `Array` |
| [`Diversity`](@ref) | Cosine similarity | `Vector` |
| [`WaveletOverlap`](@ref) | Wavelet transform | `Vector` |
| [`PowerSpectrum`](@ref) | Fourier spectra | `Vector`, `Dataset` |
| [`PowerSpectrum`](@ref) | Fourier transform | `Vector` |
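
As an illustration of the "Input data" column (a sketch: `CountOccurrences` accepting any element type is the change made in this PR, while the keyword arguments of `SymbolicPermutation` are assumptions):

```julia
using Entropies

# `CountOccurrences` places no restriction on the element type:
categorical = ["dog", "cat", "dog", "snake", "cat", "dog"]
probabilities(CountOccurrences(), categorical)   # 3/6, 2/6, 1/6 (outcome order may vary)

# Most other estimators expect real-valued input, e.g. ordinal patterns:
probabilities(SymbolicPermutation(m = 3), rand(1000))
```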

## Count occurrences (counting)
## Count occurrences

```@docs
CountOccurrences
```

## Visitation frequency (histograms)
## Histograms

```@docs
ValueHistogram
RectangularBinning
FixedRectangularBinning
```
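
A sketch of pairing `ValueHistogram` with a binning specification (the bin count is arbitrary):

```julia
using Entropies

x = Dataset(rand(10_000, 2))
est = ValueHistogram(RectangularBinning(10))  # 10 equally sized bins per dimension
p = probabilities(est, x)
```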

## Permutation (symbolic)
## Symbolic permutations

```@docs
SymbolicPermutation
@@ -61,14 +73,14 @@ SymbolicAmplitudeAwarePermutation
SpatialSymbolicPermutation
```
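
A sketch comparing the timeseries-based permutation estimators on the same input (the keyword arguments are assumed to match across the three variants):

```julia
using Entropies

x = rand(10_000)

p_plain     = probabilities(SymbolicPermutation(m = 3, τ = 1), x)
p_weighted  = probabilities(SymbolicWeightedPermutation(m = 3, τ = 1), x)
p_amplitude = probabilities(SymbolicAmplitudeAwarePermutation(m = 3, τ = 1), x)
```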

## Dispersion (symbolic)
## Dispersion patterns

```@docs
Dispersion
SpatialDispersion
```

## Transfer operator (binning)
## Transfer operator

```@docs
TransferOperator
4 changes: 2 additions & 2 deletions src/Entropies.jl
@@ -24,10 +24,10 @@ include("complexity.jl")
include("multiscale.jl")

# Library implementations (files include other files)
include("encoding/all_encodings.jl") # other structs depend on these
include("probabilities_estimators/probabilities_estimators.jl")
include("entropies/entropies.jl")
include("encoding/all_encodings.jl")
include("complexity/complexity_measures.jl") # relies on encodings, so include after
include("complexity/complexity_measures.jl")
include("deprecations.jl")


1 change: 1 addition & 0 deletions src/encoding/all_encodings.jl
@@ -1,2 +1,3 @@
include("rectangular_binning.jl")
include("gaussian_cdf.jl")
include("ordinal_pattern.jl")
82 changes: 35 additions & 47 deletions src/encoding/ordinal_pattern.jl
@@ -1,74 +1,63 @@
using StaticArrays: MVector
using StateSpaceSets: AbstractDataset
using Combinatorics: permutations

export OrdinalPatternEncoding
#TODO: The docstring here, and probably the source code, needs a full re-write
# based on new `encode` interface.

"""
OrdinalPatternEncoding <: Encoding
OrdinalPatternEncoding(; m::Int, lt = est.lt)
OrdinalPatternEncoding(m::Int, lt = Entropies.isless_rand)

An encoding scheme that [`encode`](@ref)s `m`-dimensional permutation/ordinal patterns to
integers and [`decode`](@ref)s these integers to permutation patterns based on the Lehmer
code.

## Usage

Used in [`outcomes`](@ref) with probabilities estimators such as
[`SymbolicPermutation`](@ref) to map state vectors `xᵢ ∈ x` into their
integer symbol representations `πᵢ`.
code. It is used by [`SymbolicPermutation`](@ref) and similar estimators; see their
documentation for a description of the outcome space.

## Description

The Lehmer code, as implemented here, is a bijection between the set of `factorial(n)`
possible permutations for a length-`n` sequence, and the integers `1, 2, …, n`.

- *Encoding* converts an `m`-dimensional permutation pattern `p` into its unique integer
symbol ``\\omega \\in \\{0, 1, \\ldots, m - 1 \\}``, using Algorithm 1 in Berger
et al. (2019)[^Berger2019].
- *Decoding* converts an ``\\omega_i \\in \\Omega` ` to its original permutation pattern.

`OrdinalPatternEncoding` is thus meant to be applied on a *permutation*, i.e.
a vector of indices that would sort some vector in ascending order (in practice: the
result of calling `sortperm(x)` for some input vector `x`).
The Lehmer code, as implemented here, is a bijection between the set of `factorial(m)`
possible permutations for a length-`m` sequence, and the integers `1, 2, …, factorial(m)`.
The encoding step uses algorithm 1 in Berger et al. (2019)[^Berger2019], which is
highly optimized.
The decoding step is much slower due to missing optimizations (pull requests welcomed!).

## Example

```jldoctest
julia> using Entropies

julia> x = [1.2, 5.4, 2.2, 1.1]; encoding = OrdinalPatternEncoding(m = length(x));

julia> xs = sortperm(x)
4-element Vector{Int64}:
4
1
3
2
julia> x = Dataset(rand(100, 3));

julia> s = encode(encoding, xs)
20
julia> c = OrdinalPatternEncoding(3);

julia> decode(encoding, s)
4-element SVector{4, Int64} with indices SOneTo(4):
4
1
3
2
julia> encode(c, x[1])
1
```

[^Berger2019]:
Berger, Sebastian, et al. "Teaching Ordinal Patterns to a Computer: Efficient
Encoding Algorithms Based on the Lehmer Code." Entropy 21.10 (2019): 1023.
"""
Base.@kwdef struct OrdinalPatternEncoding{M <: Integer} <: Encoding
m::M = 3
lt::Function = isless_rand
struct OrdinalPatternEncoding{M, F} <: Encoding
perm::MVector{M, Int}
lt::F
end
function OrdinalPatternEncoding(m = 3, lt::F = isless_rand) where {F}
return OrdinalPatternEncoding{m, F}(zeros(MVector{m, Int}), lt)
end

# So that SymbolicPerm stuff fallback here
total_outcomes(::OrdinalPatternEncoding{m}) where {m} = factorial(m)
outcome_space(::OrdinalPatternEncoding{m}) where {m} = permutations(1:m) |> collect

function encode(encoding::OrdinalPatternEncoding, perm)
m = encoding.m

# Notice that `χ` is an element of a `Dataset`, so most definitely a static vector in
# our code. However, we allow `AbstractVector` in case a user wants to use `encode` directly.
function encode(encoding::OrdinalPatternEncoding{m}, χ::AbstractVector) where {m}
if m != length(χ)
throw(ArgumentError("Permutation order and length of vector must match!"))
end
perm = sortperm!(encoding.perm, χ; lt = encoding.lt)
# Begin Lehmer code
n = 0
for i = 1:m-1
for j = i+1:m
@@ -77,18 +66,17 @@ n = (m-i)*n
n = (m-i)*n
end
# The Lehmer code actually results in 0 being an encoded symbol. Shift by 1, so that
# encodings are positive integers.
# encodings are the positive integers.
return n + 1
end

# I couldn't find any efficient algorithm in the literature for converting
# between factorial number system representations and Lehmer codes, so we'll just have to
# use this naive approach for now. It is probably possible to do this in a faster way.
function decode(encoding::OrdinalPatternEncoding, s::Int)
m = encoding.m
function decode(::OrdinalPatternEncoding{m}, s::Int) where {m}
# Convert integer to its factorial number representation. Each factorial number
# corresponds to a unique permutation of the numbers `1, 2, ..., m`.
f = base10_to_factorial(s - 1, m) # subtract 1 because we add 1 in `encode`
f::SVector{ndigits, Int} = base10_to_factorial(s - 1, m) # subtract 1 because we add 1 in `encode`

# Reconstruct the permutation from the factorial representation
xs = 1:m |> collect
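
A short sketch of how the new `encode`/`decode` pair behaves, using the input vector from the old docstring example above. The value 20 is the one shown in that old example; carrying it over assumes the new `encode` Lehmer-codes `sortperm(x)` in the same way.

```julia
using Entropies

x = [1.2, 5.4, 2.2, 1.1]
enc = OrdinalPatternEncoding(4)

i = encode(enc, x)   # Lehmer code of sortperm(x) = [4, 1, 3, 2], plus 1; expected to be 20
p = decode(enc, i)   # recovers the permutation pattern [4, 1, 3, 2]
p == sortperm(x)     # true: decode returns the ordinal pattern of `x`
```
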
@@ -187,6 +187,7 @@ end
# low level histogram call
##################################################################
# This method is called by `probabilities(est::ValueHistogram, x::Array_or_Dataset)`
# `fasthist!` is in the `estimators/value_histogram` folder.
"""
fasthist(c::RectangularBinEncoding, x::Vector_or_Dataset)
Intermediate method that runs `fasthist!` in the encoded space