
After the virtual machine is powered off and restarted, the internal cluster cannot be accessed: rbd-worker on 10.43.0.10:53: read udp 10.42.0.12:37857->10.43.0.10:53: i/o timeout #2136

Open
qingshi688 opened this issue Jan 19, 2025 · 3 comments

qingshi688 commented Jan 19, 2025

Describe the bug
After the virtual machine was powered off and restarted, the internal cluster cannot be accessed: rbd-worker on 10.43.0.10:53: read udp 10.42.0.12:37857->10.43.0.10:53: i/o timeout

To Reproduce
Steps to reproduce the behavior:
The d536f4dff2dff8/regions/rainbond/officialplugins API returns the following data:
{
  "code": 500,
  "msg": "get rbd plugins: get app statuses: rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing dial tcp: lookup rbd-worker on 10.43.0.10:53: read udp 10.42.0.12:37857->10.43.0.10:53: i/o timeout\"",
  "msg_show": "get rbd plugins: get app statuses: rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing dial tcp: lookup rbd-worker on 10.43.0.10:53: read udp 10.42.0.12:37857->10.43.0.10:53: i/o timeout\"",
  "data": {
    "bean": {},
    "list": []
  }
}
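
For context: on a K3s-based install (Rainbond's default), 10.43.0.10 is normally the ClusterIP of the kube-dns/CoreDNS Service and 10.42.0.0/16 is the pod CIDR, so this error means the calling pod's DNS lookup of the rbd-worker Service timed out. A minimal sketch of reproducing the failing lookup from inside the cluster (the busybox image and the rbd-system namespace are assumptions; adjust to your install):

  # Run a throwaway pod and issue the same lookup against the cluster DNS IP.
  kubectl run dns-check --rm -it --restart=Never --image=busybox:1.36 -- \
    nslookup rbd-worker.rbd-system.svc.cluster.local 10.43.0.10

If this also times out, the problem is cluster DNS itself rather than Rainbond.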
Screenshots

[Two screenshots were attached to the original issue.]

The relevant information:

  • Rainbond Version [e.g. 6.1.0]

zzzhangqi (Collaborator) commented

@qingshi688 Judging from the error, it looks like CoreDNS in your k8s cluster is failing; please check your k8s cluster.

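For reference, a minimal set of CoreDNS health checks, assuming the standard kube-system namespace, the k8s-app=kube-dns label, and a Deployment named coredns (all K3s defaults, but verify on your cluster):

  # Are the CoreDNS pods running and ready?
  kubectl -n kube-system get pods -l k8s-app=kube-dns -o wide

  # Any crash loops or upstream-resolution errors in the logs?
  kubectl -n kube-system logs -l k8s-app=kube-dns --tail=100

  # Does the kube-dns Service still front the expected ClusterIP (10.43.0.10)?
  kubectl -n kube-system get svc kube-dns

  # After an unclean power-off, restarting the pods often recovers DNS.
  kubectl -n kube-system rollout restart deployment/coredns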
