-
Hi @Dzhou12, does any of this help?
-
When I tried to learn a one-dimensional Ising model with nearest-neighbor sysy coupling (using an RBM with the default initial parameters), I ran into the problem that my cost function converges to a local minimum. When I replace sysy with sxsx or szsz coupling (the basis vectors used to represent these operators as matrices are the sz eigenvectors), the cost function converges to the correct minimum. Since physically no direction is singled out, I think it is a problem with the initial condition of the machine. Because I need sysy couplings (say, in the spin compass model), I would like to know how to overcome this problem. My code is as follows:
import numpy as np
import netket as nk
from netket.graph import Chain

# 10-site chain lattice
g = Chain(10)
N = g.n_nodes
hi = nk.hilbert.Spin(s=1/2, N=N)

# Nearest-neighbor coupling matrices, written in the sz eigenbasis
sxsx = np.array([[0, 0, 0, 1], [0, 0, 1, 0], [0, 1, 0, 0], [1, 0, 0, 0]])
sysy = np.array([[0, 0, 0, -1], [0, 0, 1, 0], [0, 1, 0, 0], [-1, 0, 0, 0]])
szsz = np.array([[1, 0, 0, 0], [0, -1, 0, 0], [0, 0, -1, 0], [0, 0, 0, 1]])

# Ferromagnetic Hamiltonian: -sysy placed on every bond of the chain
ha = nk.operator.GraphOperator(hilbert=hi, graph=g, bond_ops=[-sysy])

# RBM ansatz, local-update sampler, Adam + stochastic reconfiguration
ma = nk.models.RBM(alpha=1)
sa = nk.sampler.MetropolisLocal(hilbert=hi)
op = nk.optimizer.Adam(learning_rate=0.1)
sr = nk.optimizer.SR(diag_shift=0.01)

vstate = nk.vqs.MCState(sampler=sa, model=ma, n_samples=1008)
gs = nk.driver.VMC(ha, op, sr=sr, variational_state=vstate)
gs.run(n_iter=3000, out='out')
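As a sanity check, the three bond matrices above are just the Kronecker products of the standard Pauli matrices; a minimal snippet, reusing the arrays defined above, to verify this:

# Verify the bond matrices against Kronecker products of the Pauli matrices
sx = np.array([[0, 1], [1, 0]])
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]])
assert np.allclose(np.kron(sx, sx), sxsx)
assert np.allclose(np.kron(sy, sy), sysy)  # real entries, although sy itself is complex
assert np.allclose(np.kron(sz, sz), szsz)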
Running this, the result converges to -1, not the exact energy -10.
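For reference, the exact value can be checked with exact diagonalization; a minimal sketch using NetKet's Lanczos routine on the same operator, assuming its default arguments:

# Exact ground-state energy of the same Hamiltonian, for comparison
E0 = nk.exact.lanczos_ed(ha, k=1)
print(E0)  # should print roughly -10, the exact value quoted above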
Besides, how can I post code efficiently? (I find my code gets squeezed together when I use only one "Add code" button in the write box, so I inserted my code line by line, with one "Add code" per line.)
For a learning rate of 0.01 and a diag_shift of 0.08, the cost function converges to a local minimum around -6.4; that is the best result I could get after some trial and error. I also tried setting a larger learning rate after convergence, but that did not work either.
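A minimal sketch of such a retuning attempt, reusing the objects defined above (the names op2, sr2, gs2 and the n_iter value are illustrative, not from the original run):

# Illustrative retuning: different learning rate and diag_shift,
# restarting the driver from the already-converged variational state
op2 = nk.optimizer.Adam(learning_rate=0.01)
sr2 = nk.optimizer.SR(diag_shift=0.08)
gs2 = nk.driver.VMC(ha, op2, sr=sr2, variational_state=vstate)
gs2.run(n_iter=1000, out='out_retuned')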