Hey!
Thanks for being so responsive.
I have another question...
In the dual branch, fc_score is a fully connected layer that ends with a ReLU, and the fc_weight branch ends with a sigmoid. Am I correct that this makes the score q range between 0 and +infinity?
Is the network supposed to output normalized scores?
Hi! That is a good question! Different datasets use different perceptual scales, so each has a different range of quality scores. To make training easier, we normalize all dataset scores to 0-1, so the output should be between 0 and 1. If you want to obtain the original score, you can write the inverse of the normalization function.
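In case it helps, a minimal sketch of what that inverse mapping could look like, assuming a simple min-max normalization over the dataset's MOS range (the bounds below are hypothetical placeholders, not taken from the repo):

```python
# Sketch of reversing a min-max normalization, assuming scores were mapped
# linearly to [0, 1]. MOS_MIN / MOS_MAX are hypothetical bounds that should
# be replaced with the actual range of the original dataset.
MOS_MIN, MOS_MAX = 1.0, 5.0

def denormalize(score_01: float) -> float:
    """Map a normalized score in [0, 1] back to the original MOS scale."""
    return score_01 * (MOS_MAX - MOS_MIN) + MOS_MIN
```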
Thanks a lot!
I am trying to use your network as a backbone, with the KonIQ-pretrained weights.
I am doing pairwise finetuning (siamese-style, with shared weights), where my data consists of paired images, trained with a margin ranking loss.
The thing is, I see the network output values between -5 and 5... so I was wondering how that could be.
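For reference, this is roughly the setup I mean; a minimal sketch of pairwise finetuning with a margin ranking loss, where `model` stands for the pretrained quality network used as a shared backbone and the pair (img_a, img_b) is labeled with img_a as the higher-quality image (all names here are placeholders for my own code, not the repo's):

```python
import torch
import torch.nn as nn

# Shared-backbone (siamese-style) pairwise step with a margin ranking loss.
criterion = nn.MarginRankingLoss(margin=0.5)

def pairwise_step(model, img_a, img_b, optimizer):
    score_a = model(img_a)  # predicted quality of the "better" image
    score_b = model(img_b)  # predicted quality of the "worse" image
    # target = +1 means score_a should be ranked above score_b.
    target = torch.ones_like(score_a)
    loss = criterion(score_a, score_b, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```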
Hi, actually, looking at the code of the dual branch, even though we normalize the label scores, the output could still be greater than 1.
I think this is because the fc_score branch ends with a ReLU, meaning its output values can be greater than 1.
If fc_score outputs values greater than 1, then the dual-branch output _s = torch.sum(f * w) / torch.sum(w) would also be greater than 1.
Am I correct?
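To illustrate the point, a minimal sketch of such a dual-branch head (not the repository's exact code; layer sizes and names are hypothetical), showing why the weighted average can exceed 1:

```python
import torch
import torch.nn as nn

class DualBranchHead(nn.Module):
    """Score branch ends with ReLU, weight branch with Sigmoid."""
    def __init__(self, in_dim: int = 512, hidden: int = 128):
        super().__init__()
        self.fc_score = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.fc_weight = nn.Sequential(nn.Linear(in_dim, hidden), nn.Sigmoid())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.fc_score(x)   # >= 0, unbounded above because of ReLU
        w = self.fc_weight(x)  # in (0, 1) because of Sigmoid
        # Weighted average of f: if any entry of f exceeds 1,
        # the resulting score can also exceed 1.
        return torch.sum(f * w, dim=1) / torch.sum(w, dim=1)
```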