wavelet_center not padding enough #39
Thanks for your report. I am currently pretty busy and traveling next week, but I will have a look.

Luis
The correct way to do it with the current code is to pass a border argument. My current thinking is to require an explicit border. Here's the code that gives better results:
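The snippet that followed is not shown; as a sketch of what the border padding buys, here is the arithmetic in plain numpy (center_pad is a hypothetical stand-in, written to mimic wavelet_center's documented behavior of centering the image in a power-of-two canvas):

```python
import numpy as np

def center_pad(f, border=0):
    # Hypothetical stand-in for mahotas' wavelet_center: grow each axis
    # by the requested border on both sides, round the result up to the
    # next power of two, and center f in a zero canvas of that shape.
    shape = [int(2 ** np.ceil(np.log2(s + 2 * border))) for s in f.shape]
    out = np.zeros(shape, dtype=float)
    region = tuple(slice((ns - s) // 2, (ns - s) // 2 + s)
                   for ns, s in zip(shape, f.shape))
    out[region] = f
    return out, region

# A 510 x 510 image with border=24 lands in a 1024 x 1024 canvas
# (510 + 48 = 558, rounded up), instead of a tight 512 x 512 one.
```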
Thanks for your reply. So if a border must be set manually, what is the advantage of wavelet_center over numpy's pad function?
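For comparison, the same padding can be spelled out with np.pad once you compute the target size yourself; the sizes below are illustrative:

```python
import numpy as np

# Illustrative: pad a 510 x 510 image to 512 x 512 by hand. The part
# wavelet_center handles for you is computing the target size (next
# power of two) and the centering offsets; np.pad then does the rest.
f = np.ones((510, 510))
target = 512
pad_width = []
for s in f.shape:
    before = (target - s) // 2
    pad_width.append((before, target - s - before))
padded = np.pad(f, pad_width, mode='constant')  # zero fill by default
```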
Is there some theory that suggests the amount of zero-padding required for perfect inversion, based on signal and filter sizes? I guess what I'm saying is that if you have to fiddle with a border size parameter (where did 24 come from?) to get correct results, the additional step that wavelet_center provides doesn't buy you much.
The filter size determines that. I couldn't remember the size of the Daubechies 20 filter off the top of my head, so I rounded up to 24. I think what this discussion is showing is that wavelet_center should take the filter size into account.
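The padding requirement can be sketched in plain numpy: a periodized transform filters the signal circularly, so the wrap-around stays harmless only if a zero border at least as long as the filter (minus one) separates the data from the edge. The helper below is illustrative, not mahotas code:

```python
import numpy as np

def circular_filter(x, h):
    # Circular (periodic) convolution of x with filter h via the FFT;
    # this is the kind of edge handling a periodized wavelet step uses.
    n = len(x)
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h, n)))

x = np.arange(8.0)
h = np.array([1.0, -2.0, 3.0])  # a stand-in 3-tap filter
padded = np.concatenate([x, np.zeros(len(h) - 1)])
# With a border of len(h) - 1 zeros, the circular result matches a true
# linear convolution: the wrap-around lands entirely in the padding.
assert np.allclose(circular_filter(padded, h), np.convolve(x, h))
```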
Sounds like a fix. IMHO, it would be ideal if these methods were abstracted away somehow, so that one would work only in terms of transformations and inverse transformations.
Yes, you're probably right.
I'm not sure the daubechies transforms are working correctly. I was under the impression that, while there are issues for images whose size is not a power of 2, the wavelet_center function serves to pad with zeros up to a power of 2. It is also my understanding that performing a transform and then an inversion, with no operations in between, should yield (near) perfect reconstruction of the image. I was not seeing this, apart from the example with the luispedro image.
So I decided to run some tests, creating random images of sizes 1, ..., 512 and testing the transform/inversion for each daubechies type. I used the pixel-wise mean absolute difference for each image size and wavelet type to measure reconstruction performance. In particular, I ran the following short bit of code:
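The script itself is not shown; a sketch of such a harness, with the transform/inverse pair left pluggable (the original presumably wired in mahotas' daubechies/idaubechies together with wavelet_center/wavelet_decenter, which is an assumption here):

```python
import numpy as np

def reconstruction_errors(forward, inverse, sizes, seed=0):
    # For each size n, build a random n x n image, run it through
    # forward then inverse, and record the mean absolute pixel error.
    # inverse is expected to return an array of the original shape.
    rng = np.random.RandomState(seed)
    errors = {}
    for n in sizes:
        img = rng.random_sample((n, n))
        errors[n] = float(np.abs(inverse(forward(img)) - img).mean())
    return errors

# With mahotas, forward/inverse pairs would be built from
# daubechies/idaubechies plus wavelet_center/wavelet_decenter.
```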
Plotting the stats is telling: [plot: mean absolute reconstruction error against image size, one curve per daubechies type]
So what's going on here? I fiddled with wavelet_center and determined that it "rounds" only up to the next power of 2: sizes like 509, 510, and 511 all go to 512.
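That rounding is a plain next-power-of-two computation; as a one-liner (next_pow2 is a hypothetical name, not a mahotas function):

```python
import numpy as np

def next_pow2(n):
    # Round n up to the nearest power of two: 509, 510, 511 -> 512,
    # and 512 stays 512.
    return int(2 ** np.ceil(np.log2(n)))

[next_pow2(n) for n in (509, 510, 511, 512)]  # -> [512, 512, 512, 512]
```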
Now, if I run something like the following (where R is a random 510 x 510 matrix), letting wavelet_center round up to 512 x 512 leaves a large reconstruction error, whereas padding out with a generous explicit border reconstructs almost perfectly.
So it looks like the wavelet_center function should pad relative to the current image size, not just round up to the next power of 2. I'm not sure about odd sizes either.
For sanity, I ran the same test with the pywt wavelets library (which was MUCH slower, being pure Python). This yielded satisfactory reconstruction error across the board (notice the scaling factor on the y-axis): [plot: pywt reconstruction error against image size]