【Posted】: 2021-05-29 09:58:45
【Problem description】:
I have organized my code on GitHub (with a usable dataset).
The problem is that I want to implement an unsupervised Domain-Adversarial Neural Network (DANN) (see paper) using tf2.keras code, while most existing answers are TF1 or pure-Keras versions: they do not really address switching between TF1 and TF2, and simply disable eager execution.
When I try to add a custom gradient_reversal layer at the beginning of my domain_classifier, like this:
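For context, the overall DANN wiring from the paper (shared feature extractor, a label classifier, and a domain classifier sitting behind a gradient-reversal layer) can be sketched in tf2.keras roughly like this. This is only a minimal illustration; the layer sizes and names are placeholders, not taken from my actual model:

```python
import tensorflow as tf

@tf.custom_gradient
def grad_reverse(x):
    # Identity on the forward pass; sign-flipped gradient on the backward pass.
    def grad(dy):
        return -dy
    return tf.identity(x), grad

inputs = tf.keras.Input(shape=(32,))                       # placeholder input size
features = tf.keras.layers.Dense(16, activation="relu")(inputs)

# Label classifier branch: trained normally on source labels.
label_out = tf.keras.layers.Dense(10, activation="softmax", name="label")(features)

# Domain classifier branch: gradients flowing back into the feature
# extractor are reversed, making the features domain-invariant.
reversed_feats = tf.keras.layers.Lambda(grad_reverse)(features)
domain_out = tf.keras.layers.Dense(1, activation="sigmoid", name="domain")(reversed_feats)

model = tf.keras.Model(inputs, [label_out, domain_out])
```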
from tensorflow.python.framework import ops  # needed for ops.RegisterGradient below

@tf.custom_gradient
def reverse_gradient(X, hp_lambda):
    """Flips the sign of the incoming gradient during training."""
    try:
        reverse_gradient.num_calls += 1
    except AttributeError:
        reverse_gradient.num_calls = 1

    grad_name = "GradientReversal%d" % reverse_gradient.num_calls

    @ops.RegisterGradient(grad_name)
    def _flip_gradients(grad):
        return [tf.negative(grad) * hp_lambda]

    # g = K.get_session().graph
    with tf.Graph().as_default() as g:
        with g.gradient_override_map({'Identity': grad_name}):
            y = tf.identity(X)
    return y
from tensorflow.python.keras.engine.base_layer import Layer  # I use base_layer, and most errors come from here.

class GradientReversal(Layer):
    """Layer that flips the sign of the gradient during training."""

    def __init__(self, hp_lambda, **kwargs):
        super(GradientReversal, self).__init__(**kwargs)
        self.supports_masking = True
        self.hp_lambda = hp_lambda

    # @staticmethod
    def get_output_shape_for(input_shape):
        return input_shape

    def build(self, input_shape):
        # self.trainable_weights = []
        return

    def call(self, x, mask=None):
        return reverse_gradient(x, self.hp_lambda)

    def compute_output_shape(self, input_shape):
        return input_shape

    def get_config(self):
        config = {}
        base_config = super(GradientReversal, self).get_config()
        return dict(list(base_config.items()) + list(config.items()))
many errors occur:
Traceback (most recent call last):
File "D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 1117, in _functional_construction_call
outputs = call_fn(cast_inputs, *args, **kwargs)
File "D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\autograph\impl\api.py", line 258, in wrapper
raise e.ag_error_metadata.to_exception(e)
tensorflow.python.framework.errors_impl.OperatorNotAllowedInGraphError: in user code:
D:/Skill-worker-research/Python code and example data/SupplementarySoftware_DeepHL_python/DeepHL_python/danntest/main.py:117 call *
return reverse_gradient(x, self.hp_lambda)
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\ops\custom_gradient.py:264 __call__ **
return self._d(self._f, a, k)
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\ops\custom_gradient.py:220 decorated
return _graph_mode_decorator(wrapped, args, kwargs)
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\ops\custom_gradient.py:325 _graph_mode_decorator
result, grad_fn = f(*args)
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\framework\ops.py:503 __iter__
self._disallow_iteration()
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\framework\ops.py:499 _disallow_iteration
self._disallow_in_graph_mode("iterating over `tf.Tensor`")
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\framework\ops.py:479 _disallow_in_graph_mode
" this function with @tf.function.".format(task))
OperatorNotAllowedInGraphError: iterating over `tf.Tensor` is not allowed in Graph execution. Use Eager execution or decorate this function with @tf.function.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 1124, in _functional_construction_call
'\n"""')
TypeError: You are attempting to use Python control flow in a layer that was not declared to be dynamic. Pass `dynamic=True` to the class constructor.
Encountered error:
"""
in user code:
D:/Skill-worker-research/Python code and example data/SupplementarySoftware_DeepHL_python/DeepHL_python/danntest/main.py:117 call *
return reverse_gradient(x, self.hp_lambda)
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\ops\custom_gradient.py:264 __call__ **
return self._d(self._f, a, k)
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\ops\custom_gradient.py:220 decorated
return _graph_mode_decorator(wrapped, args, kwargs)
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\ops\custom_gradient.py:325 _graph_mode_decorator
result, grad_fn = f(*args)
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\framework\ops.py:503 __iter__
self._disallow_iteration()
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\framework\ops.py:499 _disallow_iteration
self._disallow_in_graph_mode("iterating over `tf.Tensor`")
D:\Users\xiqxi\Anaconda3\envs\tf2\lib\site-packages\tensorflow\python\framework\ops.py:479 _disallow_in_graph_mode
" this function with @tf.function.".format(task))
OperatorNotAllowedInGraphError: iterating over `tf.Tensor` is not allowed in Graph execution. Use
Eager execution or decorate this function with @tf.function.
"""
Process finished with exit code -1
I noticed that @tf.custom_gradient is TF2 API, and that TF2 no longer uses tf.Graph() to build a static graph when constructing a network, so I also tried this code:
@tf.custom_gradient
def GradientReversalOperator(x):
    def grad(dy):
        return -1 * dy
    return x, grad
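That function-with-closure pattern can be wrapped into a self-contained tf2.keras layer that needs no ops.RegisterGradient or gradient_override_map at all. A minimal sketch of what I am trying to achieve (the hp_lambda handling via closure is my own assumption, not code from any answer I found):

```python
import tensorflow as tf

class GradientReversal(tf.keras.layers.Layer):
    """Identity on the forward pass; multiplies the gradient by -hp_lambda on the backward pass."""

    def __init__(self, hp_lambda=1.0, **kwargs):
        super().__init__(**kwargs)
        self.hp_lambda = hp_lambda

    def call(self, x):
        # Define the custom gradient inside call so hp_lambda is captured
        # by closure instead of being passed as a second tensor argument.
        @tf.custom_gradient
        def _reverse(x):
            def grad(dy):
                return -self.hp_lambda * dy
            return tf.identity(x), grad
        return _reverse(x)

    def get_config(self):
        config = super().get_config()
        config["hp_lambda"] = self.hp_lambda
        return config
```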
but another error occurred. My guess is that this is because the GradientReversal class inherits from tf.keras's base_layer, which contains TF1-era syntax. Even after trying so many approaches, I have not been able to solve this problem.
I have uploaded my code along with this question. I hope someone can help me fix the problem and explain how tf.keras's base_layer works and why it is not usable in my code.
I would be very grateful for any advice you can give.
Thank you again for your help!
【Discussion】:
Tags: deep-learning tensorflow2.0 keras-layer tf.keras generative-adversarial-network