【Question Title】: Unable to complete this question due to a syntax error in the Python code for TensorFlow
【Posted】: 2020-03-27 05:02:54
【Question Description】:

"return" is outside a function. I have to return the values as a tuple. Basically, there are two errors here: first, the "return" sits outside the function; second, the result is not returned as a tuple.

def train_mnist():

class myCallback(tf.keras.callbacks.Callback):

    def on_epoch_end(self, epoch, logs={}):
        if logs.get('acc') > 0.99:
            print ('\nReached 99% accuracy so cancelling training!')
        self.model.stop_training = True

mnist = tf.keras.datasets.mnist

((x_train, y_train), (x_test, y_test)) = mnist.load_data(path=path)
(x_train, x_test) = (x_train / 255.0, x_test / 255.0)

callbacks = myCallback()

model = \
    tf.keras.models.Sequential([tf.keras.layers.Flatten(input_shape=(28,
                               28)), tf.keras.layers.Dense(512,
                               activation=tf.nn.relu),
                               tf.keras.layers.Dense(10,
                               activation=tf.nn.softmax)])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

history = model.fit(x_train, y_train, epochs=10,
                    callbacks=[callbacks])


return (history.epoch, history.history['acc'][-1])

【Question Comments】:

  • A lot of indentation is missing, so we can't be sure what the indentation in the original code is supposed to look like.
  • Why are you using a return statement? Your model is not inside a function.
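As the comments point out, `return` is only legal inside a function body, and indentation decides what belongs to that body. A minimal sketch with made-up dummy values (no TensorFlow needed) shows the shape the exercise expects, where the function returns a tuple:

```python
# Hedged sketch: dummy values stand in for a real Keras History object.
def train():
    # Everything indented under train() is part of the function body.
    history = {'epoch': [0, 1], 'accuracy': [0.95, 0.99]}
    # This return is valid because it is inside the function,
    # and it hands back both values as a single tuple.
    return (history['epoch'], history['accuracy'][-1])

epochs, final_acc = train()
print(epochs, final_acc)  # [0, 1] 0.99
```

Moving the `return` to module level (no indentation) reproduces the asker's `SyntaxError: 'return' outside function`.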

Tags: python tensorflow machine-learning syntax-error


【Solution 1】:

The problem lies in the indentation and in how the accuracy is read from the training logs.

I have modified your code as follows and obtained the expected output.

def train_mnist():

  class myCallback(tf.keras.callbacks.Callback):

      def on_epoch_end(self, epoch, logs):
          if logs["accuracy"] > 0.99:
              print ('\nReached 99% accuracy so cancelling training!')
              self.model.stop_training = True

  mnist = tf.keras.datasets.mnist

  ((x_train, y_train), (x_test, y_test)) = mnist.load_data()
  (x_train, x_test) = (x_train / 255.0, x_test / 255.0)

  callbacks = myCallback()

  model = \
      tf.keras.models.Sequential([tf.keras.layers.Flatten(input_shape=(28,
                                28)), tf.keras.layers.Dense(512,
                                activation=tf.nn.relu),
                                tf.keras.layers.Dense(10,
                                activation=tf.nn.softmax)])
  model.compile(optimizer='adam',
                loss='sparse_categorical_crossentropy',
                metrics=['accuracy'])

  history = model.fit(x_train, y_train, epochs=10,
                      callbacks=[callbacks])


  return (history.epoch, history.history['accuracy'][-1]) 

Output:

Epoch 1/10
1875/1875 [==============================] - 6s 3ms/step - loss: 0.2026 - accuracy: 0.9392
Epoch 2/10
1875/1875 [==============================] - 6s 3ms/step - loss: 0.0799 - accuracy: 0.9755
Epoch 3/10
1875/1875 [==============================] - 6s 3ms/step - loss: 0.0521 - accuracy: 0.9839
Epoch 4/10
1875/1875 [==============================] - 6s 3ms/step - loss: 0.0353 - accuracy: 0.9894
Epoch 5/10
1867/1875 [============================>.] - ETA: 0s - loss: 0.0278 - accuracy: 0.9910
Reached 99% accuracy so cancelling training!
1875/1875 [==============================] - 6s 3ms/step - loss: 0.0278 - accuracy: 0.9910
([0, 1, 2, 3, 4], 0.9909833073616028)
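One detail worth isolating: in the original code, `logs.get('acc')` returns `None` because recent TensorFlow versions key the metric as `'accuracy'`, which is why the fixed callback reads `logs["accuracy"]`. A small pure-Python mock of the per-epoch logs dict (the dict contents below are invented for illustration) makes the difference visible without running a training job:

```python
# Mocked-up logs dicts; real values come from Keras at the end of each epoch.
logs_old_key = {'loss': 0.0278, 'acc': 0.9910}       # older TF keyed it 'acc'
logs_new_key = {'loss': 0.0278, 'accuracy': 0.9910}  # TF 2.x uses 'accuracy'

def should_stop(logs, key='accuracy', target=0.99):
    """Return True when the tracked metric exists and exceeds the target."""
    value = logs.get(key)       # .get() avoids a KeyError on a missing key
    return value is not None and value > target

print(should_stop(logs_new_key))         # True: metric found and above 0.99
print(should_stop(logs_new_key, 'acc'))  # False: key absent, get() gives None
```

Comparing `None > 0.99` directly, as the original `if logs.get('acc') > 0.99` does, raises a `TypeError` in Python 3, so guarding against the missing key (or using the correct key) is essential.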

【Discussion】:

  • @A. Khan - if the answer above solved your problem, please accept and upvote it.