Calling train.minimize on an AdamOptimizer fails with UndefVarError: py_gradients not defined on worker 2:

minimize_op = train.minimize(optimizer, Loss)
ERROR: On worker 2:
UndefVarError: py_gradients not defined
#35 at /home/alex/.julia/packages/TensorFlow/gwM1d/src/TensorFlow.jl:183
#116 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Distributed/src/process_messages.jl:276
run_work_thunk at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Distributed/src/process_messages.jl:56
run_work_thunk at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Distributed/src/process_messages.jl:65
#102 at ./task.jl:259
Stacktrace:
[1] #remotecall_wait#154(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Function, ::Distributed.Worker) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Distributed/src/remotecall.jl:421
[2] remotecall_wait(::Function, ::Distributed.Worker) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Distributed/src/remotecall.jl:412
[3] #remotecall_wait#157(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Function, ::Int64) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Distributed/src/remotecall.jl:433
[4] remotecall_wait(::Function, ::Int64) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Distributed/src/remotecall.jl:433
[5] top-level scope at /home/alex/.julia/packages/TensorFlow/gwM1d/src/TensorFlow.jl:182
[6] eval at ./boot.jl:319 [inlined]
[7] eval at ./sysimg.jl:68 [inlined]
[8] add_gradients_py(::Tensor{Float64}, ::Array{Any,1}, ::Nothing) at /home/alex/.julia/packages/TensorFlow/gwM1d/src/core.jl:1548
[9] gradients at /home/alex/.julia/packages/TensorFlow/gwM1d/src/core.jl:1536 [inlined] (repeats 2 times)
[10] compute_gradients(::TensorFlow.train.AdamOptimizer, ::Tensor{Float64}, ::Nothing) at /home/alex/.julia/packages/TensorFlow/gwM1d/src/train.jl:49
[11] #minimize#1(::Nothing, ::Nothing, ::Nothing, ::Function, ::TensorFlow.train.AdamOptimizer, ::Tensor{Float64}) at /home/alex/.julia/packages/TensorFlow/gwM1d/src/train.jl:41
[12] minimize(::TensorFlow.train.AdamOptimizer, ::Tensor{Float64}) at /home/alex/.julia/packages/TensorFlow/gwM1d/src/train.jl:38
[13] top-level scope at none:0
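
For context, a minimal sketch of the kind of graph that reproduces the failure. This is a hypothetical linear model; every name except optimizer and Loss is illustrative and not taken from the actual script:

using TensorFlow

sess = Session(Graph())

# Illustrative data and model; the real graph is not shown in the report.
X = constant(randn(10, 1))
Y = constant(randn(10, 1))
W = get_variable("W", [1, 1], Float64)

Loss = reduce_mean((X * W - Y) .^ 2)
optimizer = train.AdamOptimizer()

# The call below is where the UndefVarError is raised on worker 2.
minimize_op = train.minimize(optimizer, Loss)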
Library Versions
Trying to evaluate ENV["TF_USE_GPU"] but got error: KeyError("TF_USE_GPU")
Trying to evaluate ENV["LIBTENSORFLOW"] but got error: KeyError("LIBTENSORFLOW")
tf_version(kind=:backend) = 1.10.0
tf_version(kind=:python) = 1.10.0
tf_version(kind=:julia) = 0.10.2+
Python Status
PyCall.conda = false
Trying to evaluate ENV["PYTHON"] but got error: KeyError("PYTHON")
PyCall.PYTHONHOME = /home/alex/miniconda3:/home/alex/miniconda3
String(read(#= /home/alex/.julia/packages/TensorFlow/gwM1d/src/version.jl:104 =# @cmd("pip --version"))) = pip 18.1 from /home/alex/.local/lib/python3.6/site-packages/pip (python 3.6)
String(read(#= /home/alex/.julia/packages/TensorFlow/gwM1d/src/version.jl:105 =# @cmd("pip3 --version"))) = pip 18.1 from /home/alex/.local/lib/python3.6/site-packages/pip (python 3.6)
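
One hedged check, given that pip points at the ~/.local python3.6 site-packages while PyCall.PYTHONHOME points at miniconda3 (so the tensorflow Python package may live in a different environment than the one PyCall loads): confirm that PyCall's own interpreter can actually import tensorflow.

using PyCall
pyimport("tensorflow")  # throws a PyError if PyCall's python cannot see the package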
Julia Status
Julia Version 1.0.3
Commit 099e826241 (2018-12-18 01:34 UTC)
Platform Info:
OS: Linux (x86_64-pc-linux-gnu)
CPU: Intel(R) Core(TM) i5-2410M CPU @ 2.30GHz
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-6.0.0 (ORCJIT, sandybridge)
Environment:
JULIA_EDITOR = atom -a
JULIA_NUM_THREADS = 2
JULIA_LOAD_PATH = @:/tmp/tmp065fxC
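
The diagnostic report above appears to be the output of TensorFlow.jl's built-in version dump (the quoted lines point at src/version.jl); assuming that function is available in this release, it can be regenerated with:

using TensorFlow
tf_versioninfo()  # prints Library Versions, Python Status, and Julia Status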