【Title】: "Variable weights already exists" in RNN sample code from tutorial
【Posted】: 2017-03-24 10:54:39
【Question】:

I am trying to re-implement the RNN step loop from https://www.tensorflow.org/api_docs/python/tf/contrib/rnn/static_rnn, but it does not work for me. With reuse set to True I get "Variable test/basic_lstm_cell/weights already exists", and without reuse I get "Variable test/basic_lstm_cell/weights does not exist".

import tensorflow as tf
batch_size = 32
n_steps = 10
lstm_size = 10
n_input = 17

words = tf.placeholder(tf.float32, [batch_size, n_steps, n_input])
words = tf.transpose(words, [1, 0, 2])
words = tf.reshape(words, [-1, n_input])
words = tf.split(words, n_steps, 0)

with tf.variable_scope('test', reuse=True):
    cell = tf.contrib.rnn.BasicLSTMCell(lstm_size)
    state = cell.zero_state(batch_size, dtype=tf.float32)
    outputs = []
    for input_ in words:
        output, state = cell(input_, state)
        outputs.append(output)

【Comments】:

Tags: python machine-learning tensorflow lstm recurrent-neural-network


【Solution #1】:

    Take a look at the source of the function you are trying to re-implement. The important point is that the reuse flag is not set on the first iteration of the loop, but is set on every subsequent iteration. A scope that wraps the whole loop with a constant reuse flag therefore cannot work in your case; you have to do something like

    with tf.variable_scope('test') as scope:
        cell = tf.contrib.rnn.BasicLSTMCell(lstm_size)
        state = cell.zero_state(batch_size, dtype=tf.float32)
        outputs = []
        for step, input_ in enumerate(words):
            if step > 0:
                scope.reuse_variables()
            output, state = cell(input_, state)
            outputs.append(output)
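
    Both error messages follow from `tf.get_variable`'s reuse semantics: with reuse off, a variable name may be created only once; with reuse on, it must already exist. The pattern can be sketched without TensorFlow as a minimal variable store (class and method names here are hypothetical, for illustration only):

    ```python
    class VariableStore:
        """Toy stand-in for a TF1 variable scope's name -> variable map."""

        def __init__(self):
            self._vars = {}

        def get_variable(self, name, initializer=0.0, reuse=False):
            if reuse:
                # reuse=True: the variable must already have been created.
                if name not in self._vars:
                    raise ValueError("Variable %s does not exist" % name)
                return self._vars[name]
            # reuse=False: creating the same name twice is an error.
            if name in self._vars:
                raise ValueError("Variable %s already exists" % name)
            self._vars[name] = initializer
            return self._vars[name]

    store = VariableStore()
    # First loop step: create the weights (reuse must be off).
    store.get_variable("test/basic_lstm_cell/weights")
    # Later steps: fetch the same weights (reuse must be on).
    store.get_variable("test/basic_lstm_cell/weights", reuse=True)
    ```

    This is why `scope.reuse_variables()` is called only after the first step: the cell creates its weights on step 0 and reuses them on steps 1..n_steps-1.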
    

    【Discussion】:
