(torchenv) D:\KG\BERT-BILSTM-CRF-main\BERT-BILSTM-CRF-main>python main.py
['O', 'B-故障设备', 'I-故障设备', 'B-故障原因', 'I-故障原因']
{'O': 0, 'B-故障设备': 1, 'I-故障设备': 2, 'B-故障原因': 3, 'I-故障原因': 4}
Some weights of the model checkpoint at ./model_hub/chinese-bert-wwm-ext/ were not used when initializing BertModel: ['cls.predictions.transform.dense.bias', 'cls.seq_relationship.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.dense.weight', 'cls.seq_relationship.weight', 'cls.predictions.bias', 'cls.predictions.transform.LayerNorm.bias']
This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
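
This first warning is expected here: chinese-bert-wwm-ext is a pre-training checkpoint, so it carries MLM/NSP head weights (cls.predictions.*, cls.seq_relationship.*) that a bare BertModel encoder has no counterpart for and simply discards. A minimal sketch of the load that produces the message, using the checkpoint path from the log (the rest of main.py is assumed, not shown):

    from transformers import BertModel

    # The cls.* head weights in the checkpoint have no slot in BertModel,
    # so transformers lists them as "not used". For a BERT-BiLSTM-CRF
    # encoder this is harmless and can be ignored.
    bert = BertModel.from_pretrained("./model_hub/chinese-bert-wwm-ext/")
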
C:\SoftCXY\Anaconda\envs\torch3.8\lib\site-packages\torch\nn\modules\rnn.py:62: UserWarning: dropout option adds dropout after all but last recurrent layer, so non-zero dropout expects num_layers greater than 1, but got dropout=0.1 and num_layers=1
warnings.warn("dropout option adds dropout after all but last "
D:\KG\torchenv\lib\site-packages\transformers\optimization.py:391: FutureWarning: This implementation of AdamW is deprecated and will be removed in a future version. Use the PyTorch implementation torch.optim.AdamW instead, or set no_deprecation_warning=True to disable this warning
warnings.warn(
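
The FutureWarning names its own fix: swap transformers' AdamW for the PyTorch implementation. A sketch, assuming the optimizer is built in main.py roughly like this (the learning rate is illustrative):

    import torch

    # before (deprecated): from transformers import AdamW
    # after: use the built-in implementation; `model` stands in for the
    # BERT-BiLSTM-CRF module constructed earlier in main.py.
    optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
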
main.py:52: UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
labels = torch.tensor(labels, dtype=torch.long)
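
The main.py:52 warning fires because labels is already a tensor, so torch.tensor(labels, ...) copy-constructs it the discouraged way. Applying the fix the message itself suggests:

    # before: labels = torch.tensor(labels, dtype=torch.long)
    # after, when labels is already a tensor:
    labels = labels.clone().detach().to(torch.long)

If labels can also arrive as a plain Python list, torch.as_tensor(labels, dtype=torch.long) handles both cases without triggering the warning.
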
Could not load symbol cublasGetSmCountTarget from cublas64_11.dll. Error code 127
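
That last line is the only real failure, and it is a Windows DLL-resolution issue rather than a bug in the code: error code 127 (ERROR_PROC_NOT_FOUND) means the cublas64_11.dll that got loaded does not export cublasGetSmCountTarget, which usually points to an older stand-alone CUDA 11.0/11.1 install on PATH shadowing the newer copy bundled with PyTorch. A quick diagnostic sketch (not a fix) to see which copy Windows resolves first:

    import ctypes.util

    # Searches the directories on PATH for cublas64_11.dll. If this prints
    # a path under an old CUDA toolkit rather than PyTorch's own torch\lib
    # folder, that stale DLL is the likely source of the missing-symbol
    # error; reordering PATH or removing the old toolkit usually clears it.
    print(ctypes.util.find_library("cublas64_11"))
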
Hi, when I run this I get the warning above that some of the BERT model's weights were not used. How can I fix it?