I have material with typewritten forms that is very challenging (for any binarization method), because the typewritten text sometimes fades out, while the printed ink right next to it comes out in a heavy, dark black. The scan/photograph also seems to have a non-normalized histogram:
Images (for the brightness/contrast variants, see the sketch below):
- original
- default-2021-03-09
- after contrast normalization
- after +20% brightness
- after -30% brightness
- Olena with Wolf's algorithm
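For reference, the contrast/brightness variants above can be reproduced roughly like this (just a sketch with Pillow; the filename and the exact enhancement factors are placeholders):

```python
from PIL import Image, ImageEnhance, ImageOps

img = Image.open("page.png").convert("L")  # placeholder filename, grayscale

# contrast normalization: stretch the histogram to the full 0..255 range
normalized = ImageOps.autocontrast(img)

# brightness variants: factor > 1.0 brightens, factor < 1.0 darkens
brighter = ImageEnhance.Brightness(img).enhance(1.2)  # ~ +20% brightness
darker = ImageEnhance.Brightness(img).enhance(0.7)    # ~ -30% brightness

normalized.save("page_normalized.png")
brighter.save("page_bright20.png")
darker.save("page_dark30.png")
```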
So it seems that the autoencoder gets confused by the normalized image, but benefits from making the image even darker. Might that be a general tendency (as in: if you lose fg, make the image darker, and conversely, if you get spurious bg, make it brighter)? Can we derive any metrics from the intermediate activations between encoder and decoder that might hint at quality problems? Any recommendations/considerations?
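To make the last question concrete, this is the kind of probe I have in mind (a sketch assuming a Keras model; the model path and the `"bottleneck"` layer name are placeholders for whatever the actual encoder output layer is called):

```python
import numpy as np
from tensorflow import keras

model = keras.models.load_model("default-2021-03-09.h5", compile=False)  # assumed path

# Sub-model that stops at the (assumed) bottleneck layer between encoder
# and decoder; the layer name here is a placeholder.
bottleneck = keras.Model(inputs=model.input,
                         outputs=model.get_layer("bottleneck").output)

def bottleneck_stats(patch):
    """Crude per-patch statistics that might correlate with quality problems."""
    acts = bottleneck.predict(patch[np.newaxis, ...], verbose=0)
    return {
        "mean": float(acts.mean()),
        "std": float(acts.std()),
        # fraction of (near-)dead units; a high value could hint at a
        # confusing input, but that is only a hypothesis to test
        "dead_fraction": float((np.abs(acts) < 1e-3).mean()),
    }
```

If, say, the dead fraction or the activation variance shifts sharply between the original and the normalized input, that might be the kind of signal worth looking for, but this is pure speculation on my part.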