dadabots SampleRNN Troubles
#first tried:
$ THEANO_FLAGS=mode=FAST_RUN,device=cuda*,floatX=float32 python -u models/two_tier/two_tier32k.py --exp BEST_2TIER --n_frames 64 --frame_size 16 --emb_size 256 --skip_conn False --dim 1024 --n_rnn 3 --rnn_type GRU --q_levels 256 --q_type linear --batch_size 128 --weight_norm True --learn_h0 True --which_set krallice
#got this error:
Training!
0
Traceback (most recent call last):
  File "models/two_tier/two_tier32k.py", line 614, in <module>
    cost, h0 = train_fn(seqs, h0, reset, mask)
  File "/home/cmstudio/.local/lib/python2.7/site-packages/theano/compile/function_module.py", line 917, in __call__
    storage_map=getattr(self.fn, 'storage_map', None))
  File "/home/cmstudio/.local/lib/python2.7/site-packages/theano/gof/link.py", line 325, in raise_with_op
    reraise(exc_type, exc_value, exc_trace)
  File "/home/cmstudio/.local/lib/python2.7/site-packages/theano/compile/function_module.py", line 903, in __call__
    self.fn() if output_subset is None else\
**ValueError: y_i value out of bounds**
Apply node that caused the error: CrossentropySoftmaxArgmax1HotWithBias(Dot22.0, SampleLevel.Output.b, Reshape{1}.0)
Toposort index: 174
Inputs types: [TensorType(float32, matrix), TensorType(float32, vector), TensorType(int32, vector)]
Inputs shapes: [(131072, 256), (256,), (131072,)]
Inputs strides: [(1024, 4), (4,), (4,)]
Inputs values: ['not shown', 'not shown', 'not shown']
Inputs type_num: [11, 11, 5]
Outputs clients: [[Reshape{2}(CrossentropySoftmaxArgmax1HotWithBias.0, Shape.0), Shape(CrossentropySoftmaxArgmax1HotWithBias.0)], [CrossentropySoftmax1HotWithBiasDx(Reshape{1}.0, CrossentropySoftmaxArgmax1HotWithBias.1, Reshape{1}.0)], []]
**HINT**: Re-running with most Theano optimization disabled could give you a back-trace of when this node was created. This can be done with by setting the Theano flag 'optimizer=fast_compile'. If that does not work, Theano optimizations can be disabled with 'optimizer=None'.
#Because of the hint, I tried this below, but no cigar
$ THEANO_FLAGS=optimizer=fast_compile,exception_verbosity=high,device=cuda*,floatX=float32 python -u models/two_tier/two_tier32k.py --exp BEST_2TIER --n_frames 64 --frame_size 16 --emb_size 256 --skip_conn False --dim 1024 --n_rnn 3 --rnn_type GRU --q_levels 256 --q_type linear --batch_size 128 --weight_norm True --learn_h0 True --which_set krallice
#Found the thread below and a bunch of similar reports suggesting this is an off-by-one problem with the targets (some y_i index falling outside the valid range 0..q_levels-1), but not sure:
#https://stackoverflow.com/questions/41990665/theano-valueerror-y-i-value-out-of-bound
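#A quick sanity check I could run (just a sketch; the quantize function and the stand-in data below are my own guesses, not the SampleRNN code): the targets fed to CrossentropySoftmaxArgmax1HotWithBias must be integers in [0, q_levels - 1] = [0, 255], so any negative value or anything >= 256 in the quantized audio would produce exactly this ValueError.
import numpy as np

Q_LEVELS = 256  # matches --q_levels 256 on the command line above

def linear_quantize(samples, q_levels=Q_LEVELS):
    # Map float samples in [-1, 1] linearly onto integer bins 0..q_levels-1.
    samples = np.clip(samples, -1.0, 1.0)
    return ((samples + 1.0) / 2.0 * (q_levels - 1)).astype('int32')

def check_targets(targets, q_levels=Q_LEVELS):
    # Report the min/max target index and flag anything the softmax would reject.
    lo, hi = int(targets.min()), int(targets.max())
    print('targets: min=%d max=%d (allowed: 0..%d)' % (lo, hi, q_levels - 1))
    if lo < 0 or hi >= q_levels:
        print('out-of-range targets found -- this would raise "y_i value out of bounds"')
    else:
        print('targets are in range')

if __name__ == '__main__':
    # Hypothetical stand-in batch; replace with a real quantized training batch.
    batch = linear_quantize(np.random.uniform(-1.0, 1.0, size=(128, 1024)))
    check_targets(batch)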