wrapspawner cannot find jupyterhub-singleuser on host but batchspawner can #36

Closed
gmfricke opened this issue Jan 6, 2020 · 2 comments

gmfricke commented Jan 6, 2020

Hello, thanks for implementing the interface. It looks like it could be extremely useful for our HPC cluster. We have multiple queues and this would seem to be just the thing.

Unfortunately, the jupyterhub-singleuser instance on the compute nodes can't be found when I use wrapspawner, but it works fine with batchspawner alone. I would be happy to upload parts of my config files if that would be useful. I checked on the compute node and the server is running, waiting for a connection.
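
For reference, the general shape of the setup here is wrapspawner's ProfilesSpawner wrapping batchspawner's TorqueSpawner (we use qstat/qdel, as in the logs below). The queue names, resource requests, and profile labels in this sketch are illustrative placeholders, not my exact values:

```python
# jupyterhub_config.py -- illustrative sketch only; queue names, resource
# requests, and profile labels are placeholders, not the real configuration.
import batchspawner  # noqa: F401  -- needed to register the batchspawner API handler

c.JupyterHub.spawner_class = 'wrapspawner.ProfilesSpawner'

# Each profile: (display name, key, Spawner class, dict of config overrides)
c.ProfilesSpawner.profiles = [
    ('Default queue - 1 core, 1 hour', 'default', 'batchspawner.TorqueSpawner',
     dict(req_queue='default', req_nprocs='1', req_runtime='1:00:00')),
    ('Big-memory queue - 8 cores, 4 hours', 'bigmem', 'batchspawner.TorqueSpawner',
     dict(req_queue='bigmem', req_nprocs='8', req_runtime='4:00:00')),
]
```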

When wrapspawner is enabled I get:

[snip]
Job 109152.xena.xena.alliance.unm.edu still pending
[D 2020-01-06 16:16:48.588 JupyterHub batchspawner:214] Spawner querying job: sudo -E -u mfricke qstat -x 109152.xena.xena.alliance.unm.edu
[I 2020-01-06 16:16:48.632 JupyterHub batchspawner:330] Notebook server job 109152.xena.xena.alliance.unm.edu started at xena22:58735
[D 2020-01-06 16:16:48.641 JupyterHub spawner:851] Polling subprocess every 30s
[D 2020-01-06 16:16:53.574 JupyterHub batchspawner:214] Spawner querying job: sudo -E -u mfricke qstat -x 109152.xena.xena.alliance.unm.edu
[W 2020-01-06 16:16:53.616 JupyterHub base:744] User mfricke is slow to become responsive (timeout=10)
[D 2020-01-06 16:16:53.617 JupyterHub base:746] Expecting server for mfricke at: http://xena22:58735/user/mfricke/
[I 2020-01-06 16:16:53.631 JupyterHub log:158] 302 POST /hub/spawn -> /user/mfricke/ (mfricke@::ffff:129.24.246.13) 10067.56ms
[I 2020-01-06 16:16:53.646 JupyterHub log:158] 302 GET /user/mfricke/ -> /hub/user/mfricke/ (@::ffff:129.24.246.13) 0.86ms
[D 2020-01-06 16:16:53.663 JupyterHub base:1018] Waiting for mfricke pending spawn
[I 2020-01-06 16:16:58.306 JupyterHub log:158] 200 GET /hub/api (@172.16.1.72) 1.34ms
[D 2020-01-06 16:17:00.941 JupyterHub proxy:686] Proxy: Fetching GET http://127.0.0.1:8001/api/routes
16:17:00.944 [ConfigProxy] info: 200 GET /api/routes
[I 2020-01-06 16:17:00.944 JupyterHub proxy:301] Checking routes
[I 2020-01-06 16:17:03.664 JupyterHub base:1022] Pending spawn for mfricke didn't finish in 10.0 seconds
[I 2020-01-06 16:17:03.664 JupyterHub base:1028] mfricke is pending spawn
[I 2020-01-06 16:17:03.667 JupyterHub log:158] 200 GET /hub/user/mfricke/ (mfricke@::ffff:129.24.246.13) 10010.10ms
[D 2020-01-06 16:17:18.643 JupyterHub batchspawner:214] Spawner querying job: sudo -E -u mfricke qstat -x 109152.xena.xena.alliance.unm.edu
[D 2020-01-06 16:17:48.643 JupyterHub batchspawner:214] Spawner querying job: sudo -E -u mfricke qstat -x 109152.xena.xena.alliance.unm.edu
[D 2020-01-06 16:18:18.643 JupyterHub batchspawner:214] Spawner querying job: sudo -E -u mfricke qstat -x 109152.xena.xena.alliance.unm.edu
[D 2020-01-06 16:18:48.644 JupyterHub batchspawner:214] Spawner querying job: sudo -E -u mfricke qstat -x 109152.xena.xena.alliance.unm.edu
[W 2020-01-06 16:18:49.127 JupyterHub user:510] mfricke's server never showed up at http://xena22:58735/user/mfricke/ after 120 seconds. Giving up
[D 2020-01-06 16:18:49.128 JupyterHub batchspawner:214] Spawner querying job: sudo -E -u mfricke qstat -x 109152.xena.xena.alliance.unm.edu
[I 2020-01-06 16:18:49.169 JupyterHub batchspawner:342] Stopping server job 109152.xena.xena.alliance.unm.edu
[I 2020-01-06 16:18:49.170 JupyterHub batchspawner:233] Cancelling job 109152.xena.xena.alliance.unm.edu: sudo -E -u mfricke qdel 109152.xena.xena.alliance.unm.edu
[snip]

but with batchspawner alone I get:

[snip]
Notebook server job 109153.xena.xena.alliance.unm.edu started at xena22:58732
[D 2020-01-06 16:21:04.345 JupyterHub spawner:851] Polling subprocess every 30s
[D 2020-01-06 16:21:07.914 JupyterHub batchspawner:214] Spawner querying job: sudo -E -u mfricke qstat -x 109153.xena.xena.alliance.unm.edu
[W 2020-01-06 16:21:07.960 JupyterHub base:744] User mfricke is slow to become responsive (timeout=10)
[D 2020-01-06 16:21:07.960 JupyterHub base:746] Expecting server for mfricke at: http://xena22:58732/user/mfricke/
[I 2020-01-06 16:21:07.960 JupyterHub base:1066] mfricke is pending spawn
[I 2020-01-06 16:21:08.072 JupyterHub log:158] 200 GET /hub/user/mfricke/ (mfricke@::ffff:129.24.246.13) 10178.01ms
[D 2020-01-06 16:21:08.327 JupyterHub log:158] 200 GET /favicon.ico (@::ffff:129.24.246.13) 2.35ms
[I 2020-01-06 16:21:16.409 JupyterHub log:158] 200 GET /hub/api (@172.16.1.72) 1.09ms
[D 2020-01-06 16:21:20.957 JupyterHub utils:188] Server at http://xena22:58732/user/mfricke/ responded with 302
[D 2020-01-06 16:21:20.957 JupyterHub _version:53] jupyterhub and jupyterhub-singleuser both on version 0.9.6
[I 2020-01-06 16:21:20.957 JupyterHub base:638] User mfricke took 23.047 seconds to start
[I 2020-01-06 16:21:20.958 JupyterHub proxy:242] Adding user mfricke to proxy /user/mfricke/ => http://xena22:58732
[D 2020-01-06 16:21:20.958 JupyterHub proxy:686] Proxy: Fetching POST http://127.0.0.1:8001/api/routes/user/mfricke
16:21:20.959 [ConfigProxy] info: Adding route /user/mfricke -> http://xena22:58732
16:21:20.960 [ConfigProxy] info: 201 POST /api/routes/user/mfricke
[I 2020-01-06 16:21:20.961 JupyterHub users:533] Server mfricke is ready
[I 2020-01-06 16:21:20.961 JupyterHub log:158] 200 GET /hub/api/users/mfricke/server/progress (mfricke@::ffff:129.24.246.13) 12656.60ms
[D 2020-01-06 16:21:22.143 JupyterHub batchspawner:214] Spawner querying job: sudo -E -u mfricke qstat -x 109153.xena.xena.alliance.unm.edu
[snip]

Thanks for any help or advice you can provide.

All the best,

Matthew


hoba87 commented Jan 7, 2020

I think I have had the same issue and have found a workaround as described in #35 (comment).


gmfricke commented Jan 8, 2020

> I think I have had the same issue and have found a workaround as described in #35 (comment).

That worked for me. Thank you! I'm not closing this yet, in case the comment is helpful to others with the same issue.

gmfricke added a commit to gmfricke/batchspawner that referenced this issue Jan 8, 2020
Fixed missing variable name. The addition of this line is based on jupyterhub/wrapspawner#36
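
Neither the referenced commit nor the #35 workaround is reproduced in this thread. One plausible reading of the logs above is that, when wrapped, the hub keeps polling a stale address because the wrapping spawner never picks up the ip/port that the child batch spawner reports once the job starts. The following is a hypothetical sketch of that general idea only (ForwardingProfilesSpawner is an invented name, not code from the commit or from #35):

```python
# Hypothetical sketch -- NOT the contents of the referenced commit or of the
# #35 workaround. It illustrates the general idea of having the wrapping
# spawner report the connection info that the child (batch) spawner discovered.
from wrapspawner import ProfilesSpawner


class ForwardingProfilesSpawner(ProfilesSpawner):
    """Hypothetical ProfilesSpawner that mirrors its child's ip/port."""

    async def start(self):
        # ProfilesSpawner.start() constructs and starts the child spawner
        # (e.g. a TorqueSpawner submitting the batch job).
        result = await super().start()
        # Copy the connection details the child learned while starting, so
        # JupyterHub -- which inspects the wrapper -- polls the right address.
        if self.child_spawner is not None:
            self.ip = self.child_spawner.ip
            self.port = self.child_spawner.port
        return result
```

To try something along these lines you would point c.JupyterHub.spawner_class at the subclass's import path; for the fix that actually worked here, see the workaround linked from #35.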