
features.add_feature_edge( lambda y_, y: 0 ) #3

Open
goog opened this issue Aug 15, 2012 · 12 comments

@goog

goog commented Aug 15, 2012

Hey, shuyo! About line 351 in crf.py: `features.add_feature_edge( lambda y_, y: 0 )` — is there a problem with it? Thanks!!!

@shuyo
Owner

shuyo commented Aug 16, 2012

I cannot remember why I wrote that line... I guess it was meant as a sample skeleton, maybe :D

@goog
Author

goog commented Aug 16, 2012

BTW, which algorithm did you apply to compute the gradient in the gradient_likelihood function?

@shuyo
Owner

shuyo commented Aug 16, 2012

I reckon no specific algorithm is needed to calculate the gradient of the likelihood for a CRF, because it has a closed form.
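For context, the closed form in question is presumably the standard CRF log-likelihood gradient: empirical feature counts minus expected feature counts under the model, plus a regularization term. A minimal sketch with hypothetical names (this is not the actual crf.py API):

```python
import numpy as np

def gradient_loglikelihood(empirical_counts, expected_counts, theta, sigma2=10.0):
    """Closed-form gradient of an L2-regularized CRF log-likelihood:
    dL/dtheta_k = E_data[f_k] - E_model[f_k] - theta_k / sigma2.
    All arguments are NumPy arrays of equal shape; the names are illustrative."""
    return empirical_counts - expected_counts - theta / sigma2

# When expected counts match empirical counts and theta is zero,
# the gradient vanishes (the model already fits the data).
g = gradient_loglikelihood(np.array([2.0, 1.0]), np.array([2.0, 1.0]), np.zeros(2))
assert np.allclose(g, 0.0)
```

The only nontrivial work is computing the expected counts, which forward-backward provides; the gradient itself is just this subtraction.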

@goog
Author

goog commented Aug 16, 2012

Hey! Could you explain that in more detail? I'm not familiar with it. Thanks!

@goog
Author

goog commented Aug 16, 2012

I fixed it like this:

    pre_y = ''
    for j, label in enumerate(features.labels):
        # Observation features: label/token and label/info pairs.
        for token in tokens:
            features.add_feature(lambda x, y, l=label, t=token: 1 if y == l and x[0] == t else 0)
        for info in infos:
            features.add_feature(lambda x, y, l=label, i=info: 1 if y == l and x[1] == i else 0)
        # Edge features: transitions between consecutive labels.
        if j:
            features.add_feature_edge(lambda y_, y, p=pre_y, l=label: 1 if y_ == p and y == l else 0)
        pre_y = label
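Worth noting: the default arguments (`l=label`, `t=token`) in the lambdas above are essential, because Python closures capture variables by reference, not by value; without them, every feature function would see only the last loop value. A minimal demonstration of the pitfall:

```python
# Without default arguments, all closures share the loop variable i,
# so every lambda returns its final value.
bad = [lambda: i for i in range(3)]
assert [f() for f in bad] == [2, 2, 2]

# Binding the current value as a default argument freezes it per lambda.
good = [lambda i=i: i for i in range(3)]
assert [f() for f in good] == [0, 1, 2]
```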

@goog
Author

goog commented Aug 16, 2012

Hey, pal! There is a small problem in the function _calc_fmlist: it doesn't take positions into consideration.

@shuyo
Owner

shuyo commented Aug 17, 2012

Sutton's CRF tutorial introduced the gradient of the likelihood.
http://people.cs.umass.edu/~mccallum/papers/crf-tutorial.pdf

@goog
Author

goog commented Aug 17, 2012

Thanks! BTW, what kind of difference is used in the gradient computation?

@shuyo
Owner

shuyo commented Aug 17, 2012

Like what problem?

@goog
Author

goog commented Aug 17, 2012

Is the gradient computed using differences?

@shuyo
Owner

shuyo commented Aug 17, 2012

What do you mean by "differences"?
If you mean numerical differentiation, then no.
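Even though the analytic gradient is used, numerical differentiation remains useful as a sanity check against it. A generic central-difference gradient checker (nothing here is specific to crf.py; function and parameter names are illustrative):

```python
import numpy as np

def numerical_gradient(f, theta, eps=1e-5):
    """Central-difference approximation of the gradient of a scalar
    function f at the point theta (a 1-D NumPy array)."""
    grad = np.zeros_like(theta)
    for k in range(theta.size):
        d = np.zeros_like(theta)
        d[k] = eps
        grad[k] = (f(theta + d) - f(theta - d)) / (2 * eps)
    return grad

# Sanity check against a function with a known gradient:
# f(theta) = sum(theta**2) has gradient 2 * theta.
theta = np.array([1.0, -2.0, 0.5])
approx = numerical_gradient(lambda t: np.sum(t ** 2), theta)
assert np.allclose(approx, 2 * theta, atol=1e-6)
```

Running the same check with the CRF's log-likelihood in place of the toy function would verify the analytic gradient, at the cost of one forward-backward pass per parameter.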

@goog
Author

goog commented Aug 17, 2012

Hi, shuyo! I have some questions:
1. In the partial derivative (1.21) in Sutton's CRF tutorial, why is there a probability term? It's difficult for me to understand.
2. How do you store models, or merge models trained on several different training datasets?
3. When "fmlist doesn't depend on x_i", is it like an HMM, and in which scenarios does that case apply?
4. The subscripts of logbetas in gradient_likelihood seem inconsistent with my book.
I want to fully understand your code, so if possible please add more detailed comments for people who, like me, are newbies to CRFs. Thanks in advance!!!
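On question 1: the probability in (1.21) appears because the second term of the gradient is the model's expectation of each feature, computed from marginal probabilities obtained by forward-backward. A toy illustration of such an expected feature count (the numbers below are made up for the example):

```python
import numpy as np

# Hypothetical marginal distribution p(y_prev, y | x) at one position,
# over two labels, as forward-backward would produce it.
marginal = np.array([[0.1, 0.2],
                     [0.3, 0.4]])

# Binary edge feature f(y_prev, y) = 1 iff y_prev == 0 and y == 1.
f = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# Expected feature count: sum over all label pairs of p * f.
expected = np.sum(marginal * f)
assert abs(expected - 0.2) < 1e-12
```

Summing such terms over all positions gives the model-expectation side of the gradient; that is where the probability in (1.21) comes from.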
