【Question Title】: Speeding up a pytorch tensor operation
【Posted】: 2022-08-18 18:38:51
【Question】:

I am trying to speed up the operation below using some kind of matrix/vector multiplication. Can anyone see a good, fast solution? It should also work in the special case where the tensor has shape 0 (torch.Size([])), but I have not been able to initialize such a tensor. See the image below for the kind of tensor I am referring to: tensor to add to test

def adstock_geometric(x: torch.Tensor, theta: float):
    x_decayed = torch.zeros_like(x)
    x_decayed[0] = x[0]

    for xi in range(1, len(x_decayed)):
        x_decayed[xi] = x[xi] + theta * x_decayed[xi - 1]

    return x_decayed

def adstock_multiple_samples(x: torch.Tensor, theta: torch.Tensor):

    listtheta = theta.tolist()
    if isinstance(listtheta, float):
        return adstock_geometric(x=x,
                                 theta=theta)
    x_decayed = torch.zeros((len(listtheta), *x.shape))  # one decayed series per theta sample
    for idx, theta_ in enumerate(listtheta):
        x_decayed_one_entry = adstock_geometric(x=x,
                                                theta=theta_)
        x_decayed[idx] = x_decayed_one_entry
    return x_decayed

if __name__ == '__main__':
    ones = torch.tensor([1])
    hundreds = torch.tensor([idx for idx in range(100)])
    x = torch.tensor([[idx] for idx in range(112)])
    ones = adstock_multiple_samples(x=x,
                                    theta=ones)
    hundreds = adstock_multiple_samples(x=x,
                                        theta=hundreds)
    print(ones)
    print(hundreds)
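The loop in adstock_geometric implements the first-order recurrence x_decayed[i] = x[i] + theta * x_decayed[i-1], which unrolls to the closed form x_decayed[i] = sum over j <= i of theta**(i-j) * x[j]. That closed form is what makes a matrix formulation possible. A minimal numerical check of the identity (the values and names below are arbitrary, chosen for illustration, not taken from the question):

```python
import torch

theta = 0.5
x = torch.arange(6, dtype=torch.float64)

# Loop version: y[0] = x[0]; y[i] = x[i] + theta * y[i-1]
y_loop = torch.zeros_like(x)
y_loop[0] = x[0]
for i in range(1, len(x)):
    y_loop[i] = x[i] + theta * y_loop[i - 1]

# Closed-form version: y[i] = sum_{j <= i} theta**(i - j) * x[j]
idx = torch.arange(len(x))
powers = (idx[:, None] - idx[None, :]).clip(0)  # exponent i - j, clipped at 0
weights = (theta ** powers).tril()              # keep only terms with j <= i
y_closed = weights.to(x.dtype) @ x

print(torch.allclose(y_loop, y_closed))  # True
```

The weights form a lower-triangular Toeplitz matrix, so the whole recurrence collapses into one matrix-vector product, at the cost of materializing an n-by-n tensor.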

  • Why the downvote?

Tags: python pytorch vectorization tensor


【Solution 1】:

I came up with the following:

import torch

def adstock_multiple_samples(x: torch.Tensor, theta: torch.Tensor):
    powers = (torch.arange(len(x))[:, None] - torch.arange(len(x))).clip(0)
    return ((theta[:, None, None] ** powers[None, :, :]).tril() * x).sum(-1)

It behaves as expected:

>>> x = torch.arange(5)
>>> theta = torch.arange(3)
>>> adstock_multiple_samples(x, theta)
tensor([[ 0,  1,  2,  3,  4],
        [ 0,  1,  3,  6, 10],
        [ 0,  1,  4, 11, 26]])

Note that it also works for theta = torch.empty((0,)), for which it returns an empty tensor.
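As a sanity check (a sketch of my own, not part of the original answer), the vectorized function can be compared against the question's loop for several theta values at once. Two caveats worth noting: this vectorized form assumes a 1-D x, so the question's x of shape (112, 1) would need a squeeze(-1) first, and it materializes a (len(theta), len(x), len(x)) tensor, so memory grows quadratically in len(x):

```python
import torch

def adstock_geometric(x: torch.Tensor, theta: float):
    # Original loop implementation from the question, for comparison.
    x_decayed = torch.zeros_like(x)
    x_decayed[0] = x[0]
    for i in range(1, len(x)):
        x_decayed[i] = x[i] + theta * x_decayed[i - 1]
    return x_decayed

def adstock_multiple_samples(x: torch.Tensor, theta: torch.Tensor):
    # Vectorized version from the answer (expects 1-D x).
    powers = (torch.arange(len(x))[:, None] - torch.arange(len(x))).clip(0)
    return ((theta[:, None, None] ** powers[None, :, :]).tril() * x).sum(-1)

x = torch.arange(5, dtype=torch.float64)
theta = torch.tensor([0.0, 0.5, 0.9], dtype=torch.float64)

vectorized = adstock_multiple_samples(x, theta)
looped = torch.stack([adstock_geometric(x, t.item()) for t in theta])
print(torch.allclose(vectorized, looped))  # True
```

Row b of the result is the decayed series for theta[b], so the loop over theta samples in the question's adstock_multiple_samples disappears entirely.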

【Comments】:
