DynGEM, published in 2018, is a structure-preserving dynamic network representation learning algorithm for generating stable embeddings of dynamic graphs.

Paper link: https://arxiv.xilesou.top/pdf/1805.11273.pdf


Main contributions:

  1. A deep autoencoder is used to generate highly non-linear embedding vectors;
  2. The model trained at time step t−1 is used to initialize the model at time step t before training continues, which both keeps the embeddings stable and makes training efficient (fast convergence);
  3. Since new nodes appear in a dynamic network, PropSize is proposed to dynamically grow the neural network (in both depth and width);
  4. Stability metrics are introduced for dynamic network embedding.

1 Notation

  • $G_t = (V_t, E_t)$: the graph snapshot at time step $t$; the dynamic graph is the sequence $\{G_1, \dots, G_T\}$
  • $S_t$: the weighted adjacency matrix of $G_t$, whose $i$-th row $s_i$ is the neighborhood vector of node $v_i$
  • $d$: the embedding dimension; $Y_t$ (also written $F_t$): the $d$-dimensional embeddings of all nodes at time $t$
  • $\theta_t$: the parameters of the deep autoencoder at time step $t$


2 Stability metric

A successful dynamic network embedding algorithm should produce embeddings that are stable over time.

Stable dynamic embedding: if the network changes only slightly between two consecutive time steps, the embeddings at those two time steps should also change only slightly.

$S_t(V)$: the weighted adjacency matrix of the nodes in set $V$ at time step $t$

$F_t(V)$: the embedding vectors of all nodes in set $V$ at time step $t$

Absolute stability:

$$\mathcal{A}_{abs}(F;t) = \frac{\lVert F_{t+1}(V_t) - F_t(V_t) \rVert_F}{\lVert S_{t+1}(V_t) - S_t(V_t) \rVert_F}$$

Relative stability:

$$\mathcal{A}_{rel}(F;t) = \frac{\lVert F_{t+1}(V_t) - F_t(V_t) \rVert_F \,/\, \lVert F_t(V_t) \rVert_F}{\lVert S_{t+1}(V_t) - S_t(V_t) \rVert_F \,/\, \lVert S_t(V_t) \rVert_F}$$

Stability constant:

$$K_S(F) = \max_{\tau, \tau'} \left| \mathcal{A}_{rel}(F;\tau) - \mathcal{A}_{rel}(F;\tau') \right|$$

The smaller this constant is, the more stable the embedding algorithm.
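
To make the definitions concrete, here is a minimal NumPy sketch (not code from the paper) of the relative stability and the stability constant, assuming the adjacency matrices and embeddings have already been restricted to the nodes shared by consecutive snapshots and have their rows aligned:

```python
import numpy as np

def relative_stability(S_t, S_next, Y_t, Y_next):
    """A_rel(F; t): relative change of the embeddings divided by the
    relative change of the adjacency matrix (Frobenius norms)."""
    emb_change = np.linalg.norm(Y_next - Y_t) / np.linalg.norm(Y_t)
    adj_change = np.linalg.norm(S_next - S_t) / np.linalg.norm(S_t)
    return emb_change / adj_change

def stability_constant(S_list, Y_list):
    """K_S(F): largest gap between the relative stabilities of any two time steps."""
    a_rel = [relative_stability(S_list[t], S_list[t + 1], Y_list[t], Y_list[t + 1])
             for t in range(len(S_list) - 1)]
    return max(a_rel) - min(a_rel)   # = max_{tau, tau'} |A_rel(tau) - A_rel(tau')|
```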


3 DynGEM

3.1 Autoencoder

DynGEM uses a deep autoencoder to map the input data into a highly non-linear latent space and capture the connectivity patterns of the current graph snapshot.

The model is a semi-supervised method that minimizes a combination of two objectives corresponding to first-order and second-order proximity.

The autoencoder is shown in the figure below.

  • The neighborhood of node $v_i$ is represented by $s_i$, the $i$-th row of $S_t$.
  • For any node pair $(v_i, v_j)$ of the current snapshot $G_t$, the encoder takes the neighborhoods $s_i$ and $s_j$ as input and produces the $d$-dimensional embeddings $y_i$ and $y_j$.
  • The decoder then reconstructs the neighborhoods $\hat{s}_i$ and $\hat{s}_j$ from the embeddings $y_i$ and $y_j$.

(Figure: the deep autoencoder architecture of DynGEM)
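
A minimal PyTorch sketch of this kind of autoencoder (the layer widths here are illustrative, not the ones used in the paper): each node's neighborhood vector $s_i$ goes in, and a $d$-dimensional embedding $y_i$ plus a reconstruction $\hat{s}_i$ come out.

```python
import torch
import torch.nn as nn

def mlp(sizes):
    """Stack of Linear + ReLU layers with the given widths."""
    layers = []
    for i in range(len(sizes) - 1):
        layers += [nn.Linear(sizes[i], sizes[i + 1]), nn.ReLU()]
    return nn.Sequential(*layers)

class GraphAutoencoder(nn.Module):
    def __init__(self, n_nodes, hidden=(500, 300), d=128):
        super().__init__()
        self.encoder = mlp([n_nodes, *hidden, d])        # s_i -> y_i
        self.decoder = mlp([d, *hidden[::-1], n_nodes])  # y_i -> s_hat_i

    def forward(self, s):           # s: batch of neighborhood vectors (rows of S_t)
        y = self.encoder(s)         # d-dimensional embeddings
        s_hat = self.decoder(y)     # reconstructed neighborhoods
        return y, s_hat
```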

3.2 Handling growing graphs

Because new nodes keep appearing in a dynamic network, the autoencoder has to be expanded while preserving the weights trained at the previous time step.

The key question is how many hidden layers the network should have and how many hidden units each layer needs; PropSize is proposed to answer it.

3.2.1 PropSize

PropSize computes the required size of the neural network at every time step and inserts new layers when needed.

For every pair of consecutive autoencoder layers $l_k$ and $l_{k+1}$ (starting from the input layer), the following condition must be satisfied:

$$\text{size}(l_{k+1}) \ge \rho \cdot \text{size}(l_k)$$

where $0 < \rho < 1$ is a hyperparameter.

  • If a pair $(l_k, l_{k+1})$ does not satisfy the condition, the width of $l_{k+1}$ (its number of hidden units) is increased.
  • The size of the embedding layer $y_t$ is always kept fixed at $d$.
  • If the rule is still not satisfied between the penultimate layer and the embedding layer, new layers are inserted between them.
  • The same rule is applied to the decoder as well; a sketch of the heuristic follows this list.
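
A hedged Python sketch of the PropSize idea for the encoder widths (the decoder is handled symmetrically); `rho` and the concrete layer sizes below are illustrative, not taken from the paper:

```python
def propsize(layer_sizes, rho=0.3):
    """layer_sizes: encoder widths [input, hidden..., d]. Widths only grow and the
    embedding width d is never changed; new layers are inserted in front of the
    embedding layer if the rule cannot be met there."""
    sizes = list(layer_sizes)
    d = sizes[-1]
    # 1) widen every layer (except input and embedding) so size(l_{k+1}) >= rho * size(l_k)
    for k in range(1, len(sizes) - 1):
        sizes[k] = max(sizes[k], int(rho * sizes[k - 1]))
    # 2) if the rule still fails between the penultimate layer and the embedding layer,
    #    insert new layers until it holds
    while d < rho * sizes[-2]:
        sizes.insert(-1, int(rho * sizes[-2]))
    return sizes

# Example: a larger input layer at time t forces the hidden layers to grow as well.
print(propsize([10_000, 500, 300, 128], rho=0.3))   # -> [10000, 3000, 900, 270, 128]
```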

3.3 Loss functions and training

The model parameters are learned by minimizing a weighted combination of three objectives:

$$L_{net} = L_{glob} + \alpha L_{loc} + \nu_1 L_1 + \nu_2 L_2$$

$\alpha$, $\nu_1$ and $\nu_2$ are hyperparameters.

$L_{loc} = \sum_{i,j} s_{ij} \lVert y_i - y_j \rVert_2^2$ is the first-order proximity term, which preserves the local structure of the network.

$L_{glob} = \sum_i \lVert (\hat{s}_i - s_i) \odot b_i \rVert_2^2$ is the second-order proximity term, which preserves the global neighborhood of each node.

$b_i$ is a vector with $b_{ij} = 1$ if $s_{ij} = 0$ and $b_{ij} = \beta > 1$ otherwise, so that incorrectly reconstructing an existing edge is penalized more heavily than incorrectly reconstructing a non-existent one.

Regularization terms:

$$L_1 = \sum_k \left( \lVert W^{(k)} \rVert_1 + \lVert \hat{W}^{(k)} \rVert_1 \right), \qquad L_2 = \sum_k \left( \lVert W^{(k)} \rVert_F^2 + \lVert \hat{W}^{(k)} \rVert_F^2 \right)$$

where $W^{(k)}$ and $\hat{W}^{(k)}$ denote the weight matrices of the $k$-th encoder and decoder layers, respectively.

At every time step DynGEM learns the deep autoencoder parameters $\theta_t$ and outputs the resulting embeddings $Y_t$ as the representation of the current snapshot.
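
A minimal PyTorch sketch of the combined objective (not the authors' implementation); `S` is a dense snapshot adjacency matrix, `Y` and `S_hat` are the embeddings and reconstructions for all nodes, and the hyperparameter values are placeholders:

```python
import torch

def dyngem_loss(S, Y, S_hat, model, alpha=1e-5, beta=5.0, nu1=1e-4, nu2=1e-4):
    # L_glob: second-order proximity, reconstruction error weighted by B (beta on edges)
    B = torch.where(S > 0, torch.full_like(S, beta), torch.ones_like(S))
    L_glob = (((S_hat - S) * B) ** 2).sum()
    # L_loc: first-order proximity, connected nodes should have nearby embeddings
    pairwise = torch.cdist(Y, Y) ** 2          # ||y_i - y_j||_2^2 for all node pairs
    L_loc = (S * pairwise).sum()
    # L1 / L2 regularization (applied to all parameters here for brevity)
    L1 = sum(p.abs().sum() for p in model.parameters())
    L2 = sum((p ** 2).sum() for p in model.parameters())
    return L_glob + alpha * L_loc + nu1 * L1 + nu2 * L2
```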

3.4 Stability

For the snapshot sequence $\{G_1, \dots, G_T\}$:

  • At $t = 1$, the deep autoencoder is trained from scratch with randomly initialized parameters $\theta_1$.
  • At every later time step, before the autoencoder is widened or deepened, the model parameters $\theta_t$ are initialized with $\theta_{t-1}$; this transfers knowledge directly from $f_{t-1}$ to $f_t$, so the model only has to learn the changes between the two snapshots, converges quickly, and stays stable (a sketch of this transfer follows the list).
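
A hedged sketch of this initialization step: before training on $G_t$, the weights of the previous autoencoder are copied into the new (possibly widened) one. Only the overlapping sub-blocks are copied here and any newly added units simply keep their fresh initialization; the paper's exact widening scheme is not reproduced.

```python
import torch

@torch.no_grad()
def transfer_parameters(prev_model, new_model):
    """Initialize theta_t from theta_{t-1} by copying overlapping weight blocks."""
    prev_state = prev_model.state_dict()
    new_state = new_model.state_dict()
    for name, new_param in new_state.items():
        if name in prev_state:
            old_param = prev_state[name]
            # copy the region the two tensors have in common (handles widened layers)
            region = tuple(slice(0, min(o, n))
                           for o, n in zip(old_param.shape, new_param.shape))
            new_param[region] = old_param[region]
    new_model.load_state_dict(new_state)
```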

3.5 Additional techniques

  • ReLU is used as the activation function in the autoencoder.
  • SGD with Nesterov momentum (and tuned hyperparameters) is used, which converges faster than plain SGD; a matching optimizer configuration is sketched after this list.
  • Combining L1 and L2 regularization yields better performance than either alone.
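
A standard PyTorch optimizer configuration matching these choices could look like the following; the learning rate, momentum and weight-decay values are only placeholders:

```python
import torch
from torch import nn

model = nn.Linear(16, 8)     # stand-in for the autoencoder; any nn.Module works
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=1e-3,                 # placeholder value
    momentum=0.99,           # placeholder value
    nesterov=True,           # Nesterov momentum, converges faster than plain SGD
    weight_decay=1e-4,       # contributes an L2 term; the L1 term is added in the loss
)
```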

The pseudocode of DynGEM is shown below:

(Algorithm 1: the DynGEM pseudocode from the paper)
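
Since the pseudocode figure is not reproduced here, the overall procedure can be summarized with a rough Python sketch that ties the earlier pieces together (`GraphAutoencoder`, `propsize`, `transfer_parameters` and `dyngem_loss` refer to the sketches above; the training loop is simplified to full-batch updates and placeholder hyperparameters):

```python
import torch

def dyngem(snapshots, d=128, rho=0.3, epochs=50):
    """snapshots: list of dense adjacency matrices S_1 ... S_T as torch tensors."""
    embeddings, model, sizes = [], None, None
    for t, S in enumerate(snapshots):
        if t == 0:
            sizes = propsize([S.shape[0], 500, 300, d], rho)        # initial architecture
            model = GraphAutoencoder(sizes[0], tuple(sizes[1:-1]), d)
        else:
            sizes = propsize([S.shape[0]] + sizes[1:], rho)         # grow with the graph
            new_model = GraphAutoencoder(sizes[0], tuple(sizes[1:-1]), d)
            transfer_parameters(model, new_model)                   # init theta_t from theta_{t-1}
            model = new_model
        opt = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.99, nesterov=True)
        for _ in range(epochs):
            Y, S_hat = model(S)
            loss = dyngem_loss(S, Y, S_hat, model)
            opt.zero_grad()
            loss.backward()
            opt.step()
        embeddings.append(model(S)[0].detach())                     # Y_t for this snapshot
    return embeddings
```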


Feel free to leave a comment if you spot any mistakes.

The paper link is given at the top of the post.

Please contact me for permission before reposting, and cite the source.
