
Missing model weights #10

Open
lvmenglong opened this issue May 28, 2023 · 7 comments

Comments

@lvmenglong

In the `_call()` function of `GraphConvolution` in layers.py:

```python
for i in range(len(self.support)):
    if 'weights_' + str(i) in self.vars:
        if not self.featureless:
            pre_sup = dot(x, self.vars['weights_' + str(i)], sparse=self.sparse_inputs)
        else:
            pre_sup = self.vars['weights_' + str(i)]
    else:
        pre_sup = x
    support = dot(self.support[i], pre_sup, sparse=True)
    supports.append(support)
```

Could you explain why the `else` branch executes `pre_sup = x`? Running the code shows that this leaves the second GCN layer with no weights. I hope you can reply soon and explain the intent here, thanks!

@1049451037
Owner

As you can see in the `__init__` function, when `input_dim == output_dim and not self.transform` no transform weight is created. If you want to add the weight, set `transform=True` for the second layer in models.py.
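For reference, here is a minimal numpy sketch of that behavior (an illustrative stand-in, not the repo's actual `GraphConvolution` class): when `input_dim == output_dim` and `transform=False`, no weight variable is created, so the call falls through to `pre_sup = x` and the layer only aggregates over the graph.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng):
    # Glorot/Xavier uniform initialization
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

class GraphConvSketch:
    """Illustrative stand-in for GraphConvolution: a weight is only
    created when input_dim != output_dim or transform is True."""
    def __init__(self, input_dim, output_dim, support, transform=True, seed=0):
        rng = np.random.default_rng(seed)
        self.support = support          # dense stand-in for the (sparse) adjacency
        self.weight = None
        if input_dim != output_dim or transform:
            self.weight = xavier_init(input_dim, output_dim, rng)

    def __call__(self, x):
        # mirrors the `pre_sup = x` branch in _call when no weight exists
        pre_sup = x if self.weight is None else x @ self.weight
        return self.support @ pre_sup   # neighborhood aggregation only

A = np.eye(3)                           # trivial graph: self-loops only
x = np.arange(12.0).reshape(3, 4)
layer = GraphConvSketch(4, 4, A, transform=False)
assert layer.weight is None             # no weight: pure aggregation
assert np.allclose(layer(x), x)         # identity support -> output == input
```

With `transform=True` (or mismatched dimensions) the sketch creates a weight matrix, matching the owner's suggested fix for the second layer.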

@lvmenglong
Author

> As you can see in the `__init__` function, when `input_dim == output_dim and not self.transform` no transform weight is created. If you want to add the weight, set `transform=True` for the second layer in models.py.

OK, thanks!

@lvmenglong
Author

> As you can see in the `__init__` function, when `input_dim == output_dim and not self.transform` no transform weight is created. If you want to add the weight, set `transform=True` for the second layer in models.py.

Could you share the experiment parameters behind the results reported in the paper? (ae_dim, se_dim, learning rate, number of training iterations)

@1049451037
Owner

For the results in the paper, both ae_dim and se_dim were 1000. That was because the weights were initialized from a normal distribution at the time, which produced many dead neurons; after switching to xavier_init, a dim of only a few hundred is enough. See the newer code: https://github.com/1049451037/HIN-Align
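A quick numpy illustration of why the initialization matters (illustrative numbers only, not the repo's code): with unit-variance normal weights, the variance of a layer's pre-activations grows roughly linearly with the input dimension, while Xavier/Glorot uniform initialization keeps it near 1, so activations are far less likely to saturate or die.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000                                  # hidden dim, as in the paper's setup
x = rng.normal(size=(64, n))              # unit-variance inputs

# plain N(0, 1) init: pre-activation variance blows up to ~n
W_normal = rng.normal(size=(n, n))
# Xavier/Glorot uniform init: pre-activation variance stays ~1
limit = np.sqrt(6.0 / (n + n))
W_xavier = rng.uniform(-limit, limit, size=(n, n))

var_normal = (x @ W_normal).var()         # roughly n = 1000
var_xavier = (x @ W_xavier).var()         # roughly 1
```

This matches the owner's observation: with well-scaled initialization, much smaller embedding dimensions suffice.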

@lvmenglong
Author

> For the results in the paper, both ae_dim and se_dim were 1000. That was because the weights were initialized from a normal distribution at the time, which produced many dead neurons; after switching to xavier_init, a dim of only a few hundred is enough. See the newer code: https://github.com/1049451037/HIN-Align

Does the 20 in `flags.DEFINE_float('learning_rate', 20, 'Initial learning rate.')` mean a learning rate of 0.2? I'm not very familiar with TensorFlow.

@1049451037
Owner

The learning rate is 20; the variable is used in models.py as `FLAGS.learning_rate`.

@lvmenglong
Author

OK, thank you very much!
